00:00:00.001 Started by upstream project "autotest-per-patch" build number 126141
00:00:00.001 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.011 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.012 The recommended git tool is: git
00:00:00.012 using credential 00000000-0000-0000-0000-000000000002
00:00:00.014 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.024 Fetching changes from the remote Git repository
00:00:00.027 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.039 Using shallow fetch with depth 1
00:00:00.039 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.039 > git --version # timeout=10
00:00:00.054 > git --version # 'git version 2.39.2'
00:00:00.054 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.075 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.075 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.251 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.262 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.273 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD)
00:00:02.273 > git config core.sparsecheckout # timeout=10
00:00:02.282 > git read-tree -mu HEAD # timeout=10
00:00:02.296 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5
00:00:02.315 Commit message: "inventory: add WCP3 to free inventory"
00:00:02.315 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10
00:00:02.457 [Pipeline] Start of Pipeline
00:00:02.471 [Pipeline] library
00:00:02.473 Loading library shm_lib@master
00:00:02.474 Library shm_lib@master is due for a refresh after 30 minutes, clearing.
00:00:02.475 Caching library shm_lib@master
00:00:02.475 Attempting to resolve master from remote references...
00:00:02.475 > git --version # timeout=10
00:00:02.487 > git --version # 'git version 2.39.2'
00:00:02.487 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:02.497 Setting http proxy: proxy-dmz.intel.com:911
00:00:02.497 > git ls-remote -- https://review.spdk.io/gerrit/a/build_pool/shm_lib # timeout=10
00:00:06.384 Found match: refs/heads/master revision d2a35f9b7368dd6eeb1355c656f2b93d62569800
00:00:06.388 Selected Git installation does not exist. Using Default
00:00:06.389 The recommended git tool is: NONE
00:00:06.389 using credential 00000000-0000-0000-0000-000000000002
00:00:06.392 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_libs/6e27998ca6b735f457f1bf0490b425345ba4637a91de7f2498f417cb3d899827/.git # timeout=10
00:00:06.403 Fetching changes from the remote Git repository
00:00:06.406 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/shm_lib # timeout=10
00:00:06.416 Fetching without tags
00:00:06.416 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/shm_lib
00:00:06.416 > git --version # timeout=10
00:00:06.427 > git --version # 'git version 2.39.2'
00:00:06.427 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:06.437 Setting http proxy: proxy-dmz.intel.com:911
00:00:06.438 > git fetch --no-tags --force --progress -- https://review.spdk.io/gerrit/a/build_pool/shm_lib +refs/heads/*:refs/remotes/origin/* # timeout=10
00:00:07.185 Checking out Revision d2a35f9b7368dd6eeb1355c656f2b93d62569800 (master)
00:00:07.185 > git config core.sparsecheckout # timeout=10
00:00:07.196 > git checkout -f d2a35f9b7368dd6eeb1355c656f2b93d62569800 # timeout=10
00:00:07.211 Commit message: "sorcer: add WCP5 server as sorcerer"
00:00:07.211 > git rev-list --no-walk d2a35f9b7368dd6eeb1355c656f2b93d62569800 # timeout=10
00:00:07.275 [Pipeline] node
00:00:07.407 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest
00:00:07.410 [Pipeline] {
00:00:07.430 [Pipeline] catchError
00:00:07.432 [Pipeline] {
00:00:07.453 [Pipeline] wrap
00:00:07.467 [Pipeline] {
00:00:07.477 [Pipeline] stage
00:00:07.480 [Pipeline] { (Prologue)
00:00:07.648 [Pipeline] sh
00:00:07.925 + logger -p user.info -t JENKINS-CI
00:00:07.939 [Pipeline] echo
00:00:07.940 Node: WFP50
00:00:07.946 [Pipeline] sh
00:00:08.233 [Pipeline] setCustomBuildProperty
00:00:08.246 [Pipeline] echo
00:00:08.247 Cleanup processes
00:00:08.252 [Pipeline] sh
00:00:08.530 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:08.530 2299768 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:08.542 [Pipeline] sh
00:00:08.821 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:08.821 ++ grep -v 'sudo pgrep'
00:00:08.821 ++ awk '{print $1}'
00:00:08.821 + sudo kill -9
00:00:08.821 + true
00:00:08.836 [Pipeline] cleanWs
00:00:08.846 [WS-CLEANUP] Deleting project workspace...
00:00:08.846 [WS-CLEANUP] Deferred wipeout is used...
00:00:08.852 [WS-CLEANUP] done
00:00:08.858 [Pipeline] setCustomBuildProperty
00:00:08.875 [Pipeline] sh
00:00:09.152 + sudo git config --global --replace-all safe.directory '*'
00:00:09.211 [Pipeline] httpRequest
00:00:09.239 [Pipeline] echo
00:00:09.240 Sorcerer 10.211.164.101 is alive
00:00:09.247 [Pipeline] httpRequest
00:00:09.250 HttpMethod: GET
00:00:09.251 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:09.251 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:09.276 Response Code: HTTP/1.1 200 OK
00:00:09.276 Success: Status code 200 is in the accepted range: 200,404
00:00:09.276 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:16.412 [Pipeline] sh
00:00:16.695 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:16.971 [Pipeline] httpRequest
00:00:16.992 [Pipeline] echo
00:00:16.994 Sorcerer 10.211.164.101 is alive
00:00:17.003 [Pipeline] httpRequest
00:00:17.008 HttpMethod: GET
00:00:17.009 URL: http://10.211.164.101/packages/spdk_182dd7de475bca6e9768a600616eb841d1034467.tar.gz
00:00:17.010 Sending request to url: http://10.211.164.101/packages/spdk_182dd7de475bca6e9768a600616eb841d1034467.tar.gz
00:00:17.033 Response Code: HTTP/1.1 200 OK
00:00:17.033 Success: Status code 200 is in the accepted range: 200,404
00:00:17.034 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_182dd7de475bca6e9768a600616eb841d1034467.tar.gz
00:01:01.777 [Pipeline] sh
00:01:02.066 + tar --no-same-owner -xf spdk_182dd7de475bca6e9768a600616eb841d1034467.tar.gz
00:01:06.270 [Pipeline] sh
00:01:06.551 + git -C spdk log --oneline -n5
00:01:06.551 182dd7de4 nvmf: large IU and atomic write unit reporting
00:01:06.551 968224f46 app/trace_record: add a optional option '-t'
00:01:06.551 d83ccf437 accel: clarify the usage of spdk_accel_sequence_abort()
00:01:06.551 f282c9958 doc/jsonrpc.md fix style issue
00:01:06.551 868be8ed2 iscs: chap mutual authentication should apply when configured.
00:01:06.562 [Pipeline] }
00:01:06.580 [Pipeline] // stage
00:01:06.589 [Pipeline] stage
00:01:06.591 [Pipeline] { (Prepare)
00:01:06.606 [Pipeline] writeFile
00:01:06.619 [Pipeline] sh
00:01:06.900 + logger -p user.info -t JENKINS-CI
00:01:06.913 [Pipeline] sh
00:01:07.188 + logger -p user.info -t JENKINS-CI
00:01:07.203 [Pipeline] sh
00:01:07.484 + cat autorun-spdk.conf
00:01:07.484 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:07.484 SPDK_TEST_BLOCKDEV=1
00:01:07.484 SPDK_TEST_ISAL=1
00:01:07.484 SPDK_TEST_CRYPTO=1
00:01:07.484 SPDK_TEST_REDUCE=1
00:01:07.484 SPDK_TEST_VBDEV_COMPRESS=1
00:01:07.484 SPDK_RUN_UBSAN=1
00:01:07.491 RUN_NIGHTLY=0
00:01:07.497 [Pipeline] readFile
00:01:07.526 [Pipeline] withEnv
00:01:07.529 [Pipeline] {
00:01:07.542 [Pipeline] sh
00:01:07.819 + set -ex
00:01:07.820 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:01:07.820 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:07.820 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:07.820 ++ SPDK_TEST_BLOCKDEV=1
00:01:07.820 ++ SPDK_TEST_ISAL=1
00:01:07.820 ++ SPDK_TEST_CRYPTO=1
00:01:07.820 ++ SPDK_TEST_REDUCE=1
00:01:07.820 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:07.820 ++ SPDK_RUN_UBSAN=1
00:01:07.820 ++ RUN_NIGHTLY=0
00:01:07.820 + case $SPDK_TEST_NVMF_NICS in
00:01:07.820 + DRIVERS=
00:01:07.820 + [[ -n '' ]]
00:01:07.820 + exit 0
00:01:07.829 [Pipeline] }
00:01:07.851 [Pipeline] // withEnv
00:01:07.858 [Pipeline] }
00:01:07.878 [Pipeline] // stage
00:01:07.890 [Pipeline] catchError
00:01:07.892 [Pipeline] {
00:01:07.911 [Pipeline] timeout
00:01:07.911 Timeout set to expire in 40 min
00:01:07.914 [Pipeline] {
00:01:07.935 [Pipeline] stage
00:01:07.937 [Pipeline] { (Tests)
00:01:07.955 [Pipeline] sh
00:01:08.237 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:01:08.237 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:01:08.237 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:01:08.237 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:01:08.237 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:08.237 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:01:08.237 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:01:08.237 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:08.237 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:01:08.237 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:08.237 + [[ crypto-phy-autotest == pkgdep-* ]]
00:01:08.237 + cd /var/jenkins/workspace/crypto-phy-autotest
00:01:08.237 + source /etc/os-release
00:01:08.237 ++ NAME='Fedora Linux'
00:01:08.237 ++ VERSION='38 (Cloud Edition)'
00:01:08.237 ++ ID=fedora
00:01:08.237 ++ VERSION_ID=38
00:01:08.237 ++ VERSION_CODENAME=
00:01:08.237 ++ PLATFORM_ID=platform:f38
00:01:08.237 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:08.237 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:08.237 ++ LOGO=fedora-logo-icon
00:01:08.237 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:08.237 ++ HOME_URL=https://fedoraproject.org/
00:01:08.237 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:08.237 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:08.237 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:08.237 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:08.237 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:08.237 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:08.237 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:08.237 ++ SUPPORT_END=2024-05-14
00:01:08.237 ++ VARIANT='Cloud Edition'
00:01:08.237 ++ VARIANT_ID=cloud
00:01:08.237 + uname -a
00:01:08.237 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:08.237 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:01:11.523 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:01:11.523 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:01:11.523 Hugepages
00:01:11.523 node hugesize free / total
00:01:11.523 node0 1048576kB 0 / 0
00:01:11.523 node0 2048kB 0 / 0
00:01:11.523 node1 1048576kB 0 / 0
00:01:11.523 node1 2048kB 0 / 0
00:01:11.523 
00:01:11.523 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:11.523 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:11.523 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:11.523 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:11.523 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:11.523 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:11.523 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:11.523 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:11.523 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:11.523 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
00:01:11.523 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:11.523 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:11.523 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:11.523 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:11.523 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:11.523 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:11.523 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:11.523 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:11.523 VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
00:01:11.523 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - -
00:01:11.523 + rm -f /tmp/spdk-ld-path
00:01:11.523 + source autorun-spdk.conf
00:01:11.523 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:11.523 ++ SPDK_TEST_BLOCKDEV=1
00:01:11.523 ++ SPDK_TEST_ISAL=1
00:01:11.523 ++ SPDK_TEST_CRYPTO=1
00:01:11.523 ++ SPDK_TEST_REDUCE=1
00:01:11.523 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:11.523 ++ SPDK_RUN_UBSAN=1
00:01:11.523 ++ RUN_NIGHTLY=0
00:01:11.523 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:11.523 + [[ -n '' ]]
00:01:11.523 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:11.782 + for M in /var/spdk/build-*-manifest.txt
00:01:11.782 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:11.782 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:11.782 + for M in /var/spdk/build-*-manifest.txt
00:01:11.782 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:11.782 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:11.782 ++ uname
00:01:11.782 + [[ Linux == \L\i\n\u\x ]]
00:01:11.782 + sudo dmesg -T
00:01:11.782 + sudo dmesg --clear
00:01:11.782 + dmesg_pid=2300741
00:01:11.782 + [[ Fedora Linux == FreeBSD ]]
00:01:11.782 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:11.782 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:11.782 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:11.782 + [[ -x /usr/src/fio-static/fio ]]
00:01:11.782 + export FIO_BIN=/usr/src/fio-static/fio
00:01:11.782 + FIO_BIN=/usr/src/fio-static/fio
00:01:11.782 + sudo dmesg -Tw
00:01:11.782 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:11.782 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:11.782 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:11.782 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:11.782 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:11.782 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:11.782 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:11.782 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:11.782 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:11.782 Test configuration:
00:01:11.782 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:11.782 SPDK_TEST_BLOCKDEV=1
00:01:11.782 SPDK_TEST_ISAL=1
00:01:11.782 SPDK_TEST_CRYPTO=1
00:01:11.782 SPDK_TEST_REDUCE=1
00:01:11.782 SPDK_TEST_VBDEV_COMPRESS=1
00:01:11.782 SPDK_RUN_UBSAN=1
00:01:11.782 RUN_NIGHTLY=0
18:04:55 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:01:11.782 18:04:55 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:01:11.782 18:04:55 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:01:11.782 18:04:55 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:01:11.782 18:04:55 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:11.782 18:04:55 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:11.782 18:04:55 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:11.782 18:04:55 -- paths/export.sh@5 -- $ export PATH
00:01:11.782 18:04:55 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:01:11.782 18:04:55 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:01:11.782 18:04:55 -- common/autobuild_common.sh@444 -- $ date +%s
00:01:11.782 18:04:55 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720800295.XXXXXX
00:01:11.782 18:04:55 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720800295.f0HUj5
00:01:11.782 18:04:55 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
00:01:11.782 18:04:55 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
00:01:11.782 18:04:55 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
18:04:55 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:11.782 18:04:55 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:11.782 18:04:55 -- common/autobuild_common.sh@460 -- $ get_config_params
00:01:11.782 18:04:55 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:01:11.782 18:04:55 -- common/autotest_common.sh@10 -- $ set +x
00:01:12.039 18:04:55 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:01:12.039 18:04:55 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:01:12.039 18:04:55 -- pm/common@17 -- $ local monitor
00:01:12.039 18:04:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:12.039 18:04:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:12.039 18:04:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:12.039 18:04:55 -- pm/common@21 -- $ date +%s
00:01:12.039 18:04:55 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:12.039 18:04:55 -- pm/common@21 -- $ date +%s
00:01:12.039 18:04:55 -- pm/common@25 -- $ sleep 1
00:01:12.039 18:04:55 -- pm/common@21 -- $ date +%s
00:01:12.039 18:04:55 -- pm/common@21 -- $ date +%s
00:01:12.039 18:04:55 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720800295
00:01:12.039 18:04:55 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720800295
00:01:12.039 18:04:55 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720800295
00:01:12.039 18:04:55 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720800295
00:01:12.039 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720800295_collect-vmstat.pm.log
00:01:12.039 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720800295_collect-cpu-load.pm.log
00:01:12.040 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720800295_collect-cpu-temp.pm.log
00:01:12.040 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720800295_collect-bmc-pm.bmc.pm.log
00:01:12.973 18:04:56 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:01:12.973 18:04:56 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:12.973 18:04:56 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:12.973 18:04:56 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:12.973 18:04:56 -- spdk/autobuild.sh@16 -- $ date -u
00:01:12.973 Fri Jul 12 04:04:56 PM UTC 2024
00:01:12.973 18:04:56 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:12.973 v24.09-pre-194-g182dd7de4
00:01:12.973 18:04:56 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:12.973 18:04:56 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:12.973 18:04:56 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:12.973 18:04:56 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:01:12.973 18:04:56 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:12.973 18:04:56 -- common/autotest_common.sh@10 -- $ set +x
00:01:12.973 ************************************
00:01:12.973 START TEST ubsan
00:01:12.973 ************************************
00:01:12.973 18:04:56 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan'
00:01:12.973 using ubsan
00:01:12.973 
00:01:12.973 real 0m0.001s
00:01:12.973 user 0m0.000s
00:01:12.973 sys 0m0.001s
00:01:12.973 18:04:56 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:01:12.973 18:04:56 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:01:12.973 ************************************
00:01:12.973 END TEST ubsan
00:01:12.973 ************************************
00:01:12.973 18:04:56 -- common/autotest_common.sh@1142 -- $ return 0
00:01:12.973 18:04:56 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:01:12.973 18:04:56 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:12.973 18:04:56 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:12.973 18:04:56 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:01:12.973 18:04:56 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:12.973 18:04:56 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:12.973 18:04:56 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:12.973 18:04:56 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:01:12.973 18:04:56 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:01:13.231 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:01:13.231 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:01:13.489 Using 'verbs' RDMA provider
00:01:29.781 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:41.976 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:42.547 Creating mk/config.mk...done.
00:01:42.547 Creating mk/cc.flags.mk...done.
00:01:42.547 Type 'make' to build.
00:01:42.547 18:05:26 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:01:42.547 18:05:26 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:01:42.547 18:05:26 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:01:42.547 18:05:26 -- common/autotest_common.sh@10 -- $ set +x
00:01:42.547 ************************************
00:01:42.547 START TEST make
00:01:42.547 ************************************
00:01:42.547 18:05:26 make -- common/autotest_common.sh@1123 -- $ make -j72
00:01:43.114 make[1]: Nothing to be done for 'all'.
00:02:21.867 The Meson build system
00:02:21.867 Version: 1.3.1
00:02:21.867 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:02:21.867 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:02:21.867 Build type: native build
00:02:21.867 Program cat found: YES (/usr/bin/cat)
00:02:21.867 Project name: DPDK
00:02:21.867 Project version: 24.03.0
00:02:21.867 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:02:21.867 C linker for the host machine: cc ld.bfd 2.39-16
00:02:21.867 Host machine cpu family: x86_64
00:02:21.867 Host machine cpu: x86_64
00:02:21.867 Message: ## Building in Developer Mode ##
00:02:21.867 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:21.867 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:21.867 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:21.867 Program python3 found: YES (/usr/bin/python3)
00:02:21.867 Program cat found: YES (/usr/bin/cat)
00:02:21.867 Compiler for C supports arguments -march=native: YES
00:02:21.867 Checking for size of "void *" : 8
00:02:21.867 Checking for size of "void *" : 8 (cached)
00:02:21.867 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:02:21.867 Library m found: YES
00:02:21.867 Library numa found: YES
00:02:21.867 Has header "numaif.h" : YES
00:02:21.867 Library fdt found: NO
00:02:21.867 Library execinfo found: NO
00:02:21.867 Has header "execinfo.h" : YES
00:02:21.867 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:21.867 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:21.867 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:21.867 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:21.867 Run-time dependency openssl found: YES 3.0.9
00:02:21.867 Run-time dependency libpcap found: YES 1.10.4
00:02:21.867 Has header "pcap.h" with dependency libpcap: YES
00:02:21.867 Compiler for C supports arguments -Wcast-qual: YES
00:02:21.867 Compiler for C supports arguments -Wdeprecated: YES
00:02:21.867 Compiler for C supports arguments -Wformat: YES
00:02:21.867 Compiler for C supports arguments -Wformat-nonliteral: NO
00:02:21.867 Compiler for C supports arguments -Wformat-security: NO
00:02:21.867 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:21.867 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:21.867 Compiler for C supports arguments -Wnested-externs: YES
00:02:21.867 Compiler for C supports arguments -Wold-style-definition: YES
00:02:21.867 Compiler for C supports arguments -Wpointer-arith: YES
00:02:21.867 Compiler for C supports arguments -Wsign-compare: YES
00:02:21.867 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:21.867 Compiler for C supports arguments -Wundef: YES
00:02:21.867 Compiler for C supports arguments -Wwrite-strings: YES
00:02:21.867 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:21.867 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:02:21.867 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:21.867 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:02:21.867 Program objdump found: YES (/usr/bin/objdump)
00:02:21.867 Compiler for C supports arguments -mavx512f: YES
00:02:21.867 Checking if "AVX512 checking" compiles: YES
00:02:21.867 Fetching value of define "__SSE4_2__" : 1
00:02:21.867 Fetching value of define "__AES__" : 1
00:02:21.867 Fetching value of define "__AVX__" : 1
00:02:21.867 Fetching value of define "__AVX2__" : 1
00:02:21.867 Fetching value of define "__AVX512BW__" : 1
00:02:21.867 Fetching value of define "__AVX512CD__" : 1
00:02:21.867 Fetching value of define "__AVX512DQ__" : 1
00:02:21.867 Fetching value of define "__AVX512F__" : 1
00:02:21.867 Fetching value of define "__AVX512VL__" : 1
00:02:21.867 Fetching value of define "__PCLMUL__" : 1
00:02:21.867 Fetching value of define "__RDRND__" : 1
00:02:21.867 Fetching value of define "__RDSEED__" : 1
00:02:21.867 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:02:21.867 Fetching value of define "__znver1__" : (undefined)
00:02:21.867 Fetching value of define "__znver2__" : (undefined)
00:02:21.867 Fetching value of define "__znver3__" : (undefined)
00:02:21.867 Fetching value of define "__znver4__" : (undefined)
00:02:21.867 Compiler for C supports arguments -Wno-format-truncation: YES
00:02:21.867 Message: lib/log: Defining dependency "log"
00:02:21.867 Message: lib/kvargs: Defining dependency "kvargs"
00:02:21.867 Message: lib/telemetry: Defining dependency "telemetry"
00:02:21.867 Checking for function "getentropy" : NO
00:02:21.867 Message: lib/eal: Defining dependency "eal"
00:02:21.867 Message: lib/ring: Defining dependency "ring"
00:02:21.867 Message: lib/rcu: Defining dependency "rcu"
00:02:21.867 Message: lib/mempool: Defining dependency "mempool"
00:02:21.867 Message: lib/mbuf: Defining dependency "mbuf"
00:02:21.867 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:21.867 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:21.867 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:21.867 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:21.867 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:21.867 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:21.867 Compiler for C supports arguments -mpclmul: YES
00:02:21.867 Compiler for C supports arguments -maes: YES
00:02:21.867 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:21.867 Compiler for C supports arguments -mavx512bw: YES
00:02:21.867 Compiler for C supports arguments -mavx512dq: YES
00:02:21.867 Compiler for C supports arguments -mavx512vl: YES
00:02:21.867 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:21.867 Compiler for C supports arguments -mavx2: YES
00:02:21.867 Compiler for C supports arguments -mavx: YES
00:02:21.867 Message: lib/net: Defining dependency "net"
00:02:21.867 Message: lib/meter: Defining dependency "meter"
00:02:21.867 Message: lib/ethdev: Defining dependency "ethdev"
00:02:21.867 Message: lib/pci: Defining dependency "pci"
00:02:21.867 Message: lib/cmdline: Defining dependency "cmdline"
00:02:21.867 Message: lib/hash: Defining dependency "hash"
00:02:21.867 Message: lib/timer: Defining dependency "timer"
00:02:21.867 Message: lib/compressdev: Defining dependency "compressdev"
00:02:21.867 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:21.867 Message: lib/dmadev: Defining dependency "dmadev"
00:02:21.868 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:21.868 Message: lib/power: Defining dependency "power"
00:02:21.868 Message: lib/reorder: Defining dependency "reorder"
00:02:21.868 Message: lib/security: Defining dependency "security"
00:02:21.868 Has header "linux/userfaultfd.h" : YES
00:02:21.868 Has header "linux/vduse.h" : YES
00:02:21.868 Message: lib/vhost: Defining dependency "vhost"
00:02:21.868 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:21.868 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:02:21.868 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:21.868 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:21.868 Compiler for C supports arguments -std=c11: YES
00:02:21.868 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:02:21.868 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:02:21.868 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:02:21.868 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:02:21.868 Run-time dependency libmlx5 found: YES 1.24.44.0
00:02:21.868 Run-time dependency libibverbs found: YES 1.14.44.0
00:02:21.868 Library mtcr_ul found: NO
00:02:21.868 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:02:21.868 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:02:21.868 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:02:21.868 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:02:26.065 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, 
libibverbs: YES 00:02:26.065 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" 
with dependencies libmlx5, libibverbs: YES 00:02:26.065 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:26.066 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:26.066 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:26.066 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:26.066 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:26.066 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:26.066 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:26.066 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:26.066 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:26.066 Configuring mlx5_autoconf.h using configuration 00:02:26.066 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:26.066 Run-time dependency libcrypto found: YES 3.0.9 00:02:26.066 Library IPSec_MB found: YES 00:02:26.066 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:26.066 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:26.066 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:26.066 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:26.066 Library IPSec_MB found: YES 00:02:26.066 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:26.066 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:26.066 Compiler for C supports 
arguments -std=c11: YES (cached) 00:02:26.066 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:26.066 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:26.066 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:26.066 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:26.066 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:26.066 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:26.066 Library libisal found: NO 00:02:26.066 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:26.066 Compiler for C supports arguments -std=c11: YES (cached) 00:02:26.066 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:26.066 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:26.066 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:26.066 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:26.066 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:26.066 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:26.066 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:26.066 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:26.066 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:26.066 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:26.066 Program doxygen found: YES (/usr/bin/doxygen) 00:02:26.066 Configuring doxy-api-html.conf using configuration 00:02:26.066 Configuring doxy-api-man.conf using configuration 00:02:26.066 Program mandb found: YES (/usr/bin/mandb) 00:02:26.066 Program sphinx-build found: NO 00:02:26.066 Configuring rte_build_config.h using configuration 00:02:26.066 Message: 00:02:26.066 ================= 00:02:26.066 Applications Enabled 00:02:26.066 ================= 00:02:26.066 
00:02:26.066 apps: 00:02:26.066 00:02:26.066 00:02:26.066 Message: 00:02:26.066 ================= 00:02:26.066 Libraries Enabled 00:02:26.066 ================= 00:02:26.066 00:02:26.066 libs: 00:02:26.066 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:26.066 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:26.066 cryptodev, dmadev, power, reorder, security, vhost, 00:02:26.066 00:02:26.066 Message: 00:02:26.066 =============== 00:02:26.066 Drivers Enabled 00:02:26.066 =============== 00:02:26.066 00:02:26.066 common: 00:02:26.066 mlx5, qat, 00:02:26.066 bus: 00:02:26.066 auxiliary, pci, vdev, 00:02:26.066 mempool: 00:02:26.066 ring, 00:02:26.066 dma: 00:02:26.066 00:02:26.066 net: 00:02:26.066 00:02:26.066 crypto: 00:02:26.066 ipsec_mb, mlx5, 00:02:26.066 compress: 00:02:26.066 isal, mlx5, 00:02:26.066 vdpa: 00:02:26.066 00:02:26.066 00:02:26.066 Message: 00:02:26.066 ================= 00:02:26.066 Content Skipped 00:02:26.066 ================= 00:02:26.066 00:02:26.066 apps: 00:02:26.066 dumpcap: explicitly disabled via build config 00:02:26.066 graph: explicitly disabled via build config 00:02:26.066 pdump: explicitly disabled via build config 00:02:26.066 proc-info: explicitly disabled via build config 00:02:26.066 test-acl: explicitly disabled via build config 00:02:26.066 test-bbdev: explicitly disabled via build config 00:02:26.066 test-cmdline: explicitly disabled via build config 00:02:26.066 test-compress-perf: explicitly disabled via build config 00:02:26.066 test-crypto-perf: explicitly disabled via build config 00:02:26.066 test-dma-perf: explicitly disabled via build config 00:02:26.066 test-eventdev: explicitly disabled via build config 00:02:26.066 test-fib: explicitly disabled via build config 00:02:26.066 test-flow-perf: explicitly disabled via build config 00:02:26.066 test-gpudev: explicitly disabled via build config 00:02:26.066 test-mldev: explicitly disabled via build config 00:02:26.066 test-pipeline: explicitly 
disabled via build config 00:02:26.066 test-pmd: explicitly disabled via build config 00:02:26.066 test-regex: explicitly disabled via build config 00:02:26.066 test-sad: explicitly disabled via build config 00:02:26.066 test-security-perf: explicitly disabled via build config 00:02:26.066 00:02:26.066 libs: 00:02:26.066 argparse: explicitly disabled via build config 00:02:26.066 metrics: explicitly disabled via build config 00:02:26.066 acl: explicitly disabled via build config 00:02:26.066 bbdev: explicitly disabled via build config 00:02:26.066 bitratestats: explicitly disabled via build config 00:02:26.066 bpf: explicitly disabled via build config 00:02:26.066 cfgfile: explicitly disabled via build config 00:02:26.066 distributor: explicitly disabled via build config 00:02:26.066 efd: explicitly disabled via build config 00:02:26.066 eventdev: explicitly disabled via build config 00:02:26.066 dispatcher: explicitly disabled via build config 00:02:26.066 gpudev: explicitly disabled via build config 00:02:26.066 gro: explicitly disabled via build config 00:02:26.066 gso: explicitly disabled via build config 00:02:26.066 ip_frag: explicitly disabled via build config 00:02:26.066 jobstats: explicitly disabled via build config 00:02:26.066 latencystats: explicitly disabled via build config 00:02:26.066 lpm: explicitly disabled via build config 00:02:26.066 member: explicitly disabled via build config 00:02:26.066 pcapng: explicitly disabled via build config 00:02:26.066 rawdev: explicitly disabled via build config 00:02:26.066 regexdev: explicitly disabled via build config 00:02:26.066 mldev: explicitly disabled via build config 00:02:26.066 rib: explicitly disabled via build config 00:02:26.066 sched: explicitly disabled via build config 00:02:26.066 stack: explicitly disabled via build config 00:02:26.066 ipsec: explicitly disabled via build config 00:02:26.066 pdcp: explicitly disabled via build config 00:02:26.066 fib: explicitly disabled via build config 
00:02:26.066 port: explicitly disabled via build config 00:02:26.066 pdump: explicitly disabled via build config 00:02:26.066 table: explicitly disabled via build config 00:02:26.066 pipeline: explicitly disabled via build config 00:02:26.066 graph: explicitly disabled via build config 00:02:26.066 node: explicitly disabled via build config 00:02:26.066 00:02:26.066 drivers: 00:02:26.066 common/cpt: not in enabled drivers build config 00:02:26.066 common/dpaax: not in enabled drivers build config 00:02:26.066 common/iavf: not in enabled drivers build config 00:02:26.066 common/idpf: not in enabled drivers build config 00:02:26.066 common/ionic: not in enabled drivers build config 00:02:26.066 common/mvep: not in enabled drivers build config 00:02:26.066 common/octeontx: not in enabled drivers build config 00:02:26.066 bus/cdx: not in enabled drivers build config 00:02:26.066 bus/dpaa: not in enabled drivers build config 00:02:26.066 bus/fslmc: not in enabled drivers build config 00:02:26.066 bus/ifpga: not in enabled drivers build config 00:02:26.066 bus/platform: not in enabled drivers build config 00:02:26.066 bus/uacce: not in enabled drivers build config 00:02:26.066 bus/vmbus: not in enabled drivers build config 00:02:26.066 common/cnxk: not in enabled drivers build config 00:02:26.066 common/nfp: not in enabled drivers build config 00:02:26.066 common/nitrox: not in enabled drivers build config 00:02:26.066 common/sfc_efx: not in enabled drivers build config 00:02:26.066 mempool/bucket: not in enabled drivers build config 00:02:26.066 mempool/cnxk: not in enabled drivers build config 00:02:26.066 mempool/dpaa: not in enabled drivers build config 00:02:26.066 mempool/dpaa2: not in enabled drivers build config 00:02:26.066 mempool/octeontx: not in enabled drivers build config 00:02:26.066 mempool/stack: not in enabled drivers build config 00:02:26.066 dma/cnxk: not in enabled drivers build config 00:02:26.066 dma/dpaa: not in enabled drivers build config 
00:02:26.066 dma/dpaa2: not in enabled drivers build config 00:02:26.066 dma/hisilicon: not in enabled drivers build config 00:02:26.066 dma/idxd: not in enabled drivers build config 00:02:26.066 dma/ioat: not in enabled drivers build config 00:02:26.066 dma/skeleton: not in enabled drivers build config 00:02:26.066 net/af_packet: not in enabled drivers build config 00:02:26.066 net/af_xdp: not in enabled drivers build config 00:02:26.066 net/ark: not in enabled drivers build config 00:02:26.066 net/atlantic: not in enabled drivers build config 00:02:26.066 net/avp: not in enabled drivers build config 00:02:26.066 net/axgbe: not in enabled drivers build config 00:02:26.066 net/bnx2x: not in enabled drivers build config 00:02:26.066 net/bnxt: not in enabled drivers build config 00:02:26.066 net/bonding: not in enabled drivers build config 00:02:26.066 net/cnxk: not in enabled drivers build config 00:02:26.066 net/cpfl: not in enabled drivers build config 00:02:26.066 net/cxgbe: not in enabled drivers build config 00:02:26.066 net/dpaa: not in enabled drivers build config 00:02:26.066 net/dpaa2: not in enabled drivers build config 00:02:26.066 net/e1000: not in enabled drivers build config 00:02:26.066 net/ena: not in enabled drivers build config 00:02:26.066 net/enetc: not in enabled drivers build config 00:02:26.066 net/enetfec: not in enabled drivers build config 00:02:26.066 net/enic: not in enabled drivers build config 00:02:26.066 net/failsafe: not in enabled drivers build config 00:02:26.066 net/fm10k: not in enabled drivers build config 00:02:26.066 net/gve: not in enabled drivers build config 00:02:26.066 net/hinic: not in enabled drivers build config 00:02:26.066 net/hns3: not in enabled drivers build config 00:02:26.066 net/i40e: not in enabled drivers build config 00:02:26.066 net/iavf: not in enabled drivers build config 00:02:26.066 net/ice: not in enabled drivers build config 00:02:26.066 net/idpf: not in enabled drivers build config 00:02:26.066 
net/igc: not in enabled drivers build config 00:02:26.066 net/ionic: not in enabled drivers build config 00:02:26.066 net/ipn3ke: not in enabled drivers build config 00:02:26.066 net/ixgbe: not in enabled drivers build config 00:02:26.066 net/mana: not in enabled drivers build config 00:02:26.066 net/memif: not in enabled drivers build config 00:02:26.066 net/mlx4: not in enabled drivers build config 00:02:26.066 net/mlx5: not in enabled drivers build config 00:02:26.066 net/mvneta: not in enabled drivers build config 00:02:26.066 net/mvpp2: not in enabled drivers build config 00:02:26.066 net/netvsc: not in enabled drivers build config 00:02:26.066 net/nfb: not in enabled drivers build config 00:02:26.066 net/nfp: not in enabled drivers build config 00:02:26.066 net/ngbe: not in enabled drivers build config 00:02:26.066 net/null: not in enabled drivers build config 00:02:26.066 net/octeontx: not in enabled drivers build config 00:02:26.066 net/octeon_ep: not in enabled drivers build config 00:02:26.066 net/pcap: not in enabled drivers build config 00:02:26.066 net/pfe: not in enabled drivers build config 00:02:26.066 net/qede: not in enabled drivers build config 00:02:26.066 net/ring: not in enabled drivers build config 00:02:26.066 net/sfc: not in enabled drivers build config 00:02:26.066 net/softnic: not in enabled drivers build config 00:02:26.066 net/tap: not in enabled drivers build config 00:02:26.066 net/thunderx: not in enabled drivers build config 00:02:26.066 net/txgbe: not in enabled drivers build config 00:02:26.066 net/vdev_netvsc: not in enabled drivers build config 00:02:26.066 net/vhost: not in enabled drivers build config 00:02:26.066 net/virtio: not in enabled drivers build config 00:02:26.066 net/vmxnet3: not in enabled drivers build config 00:02:26.066 raw/*: missing internal dependency, "rawdev" 00:02:26.066 crypto/armv8: not in enabled drivers build config 00:02:26.066 crypto/bcmfs: not in enabled drivers build config 00:02:26.066 
crypto/caam_jr: not in enabled drivers build config 00:02:26.066 crypto/ccp: not in enabled drivers build config 00:02:26.066 crypto/cnxk: not in enabled drivers build config 00:02:26.066 crypto/dpaa_sec: not in enabled drivers build config 00:02:26.066 crypto/dpaa2_sec: not in enabled drivers build config 00:02:26.066 crypto/mvsam: not in enabled drivers build config 00:02:26.066 crypto/nitrox: not in enabled drivers build config 00:02:26.066 crypto/null: not in enabled drivers build config 00:02:26.066 crypto/octeontx: not in enabled drivers build config 00:02:26.066 crypto/openssl: not in enabled drivers build config 00:02:26.066 crypto/scheduler: not in enabled drivers build config 00:02:26.066 crypto/uadk: not in enabled drivers build config 00:02:26.066 crypto/virtio: not in enabled drivers build config 00:02:26.066 compress/nitrox: not in enabled drivers build config 00:02:26.066 compress/octeontx: not in enabled drivers build config 00:02:26.067 compress/zlib: not in enabled drivers build config 00:02:26.067 regex/*: missing internal dependency, "regexdev" 00:02:26.067 ml/*: missing internal dependency, "mldev" 00:02:26.067 vdpa/ifc: not in enabled drivers build config 00:02:26.067 vdpa/mlx5: not in enabled drivers build config 00:02:26.067 vdpa/nfp: not in enabled drivers build config 00:02:26.067 vdpa/sfc: not in enabled drivers build config 00:02:26.067 event/*: missing internal dependency, "eventdev" 00:02:26.067 baseband/*: missing internal dependency, "bbdev" 00:02:26.067 gpu/*: missing internal dependency, "gpudev" 00:02:26.067 00:02:26.067 00:02:26.324 Build targets in project: 115 00:02:26.324 00:02:26.324 DPDK 24.03.0 00:02:26.324 00:02:26.324 User defined options 00:02:26.324 buildtype : debug 00:02:26.324 default_library : shared 00:02:26.324 libdir : lib 00:02:26.324 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:26.324 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:26.324 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:26.324 cpu_instruction_set: native 00:02:26.324 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:02:26.324 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,argparse,pcapng,bbdev 00:02:26.324 enable_docs : false 00:02:26.324 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:26.324 enable_kmods : false 00:02:26.324 max_lcores : 128 00:02:26.324 tests : false 00:02:26.324 00:02:26.324 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:26.901 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:26.901 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:27.159 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:27.159 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:27.159 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:27.159 [5/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:27.159 [6/378] Compiling C object 
lib/librte_log.a.p/log_log_linux.c.o 00:02:27.159 [7/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:27.159 [8/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:27.159 [9/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:27.159 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:27.159 [11/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:27.159 [12/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:27.159 [13/378] Linking static target lib/librte_kvargs.a 00:02:27.159 [14/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:27.159 [15/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:27.159 [16/378] Linking static target lib/librte_log.a 00:02:27.159 [17/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:27.159 [18/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:27.159 [19/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:27.427 [20/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:27.427 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:27.427 [22/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:27.686 [23/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.686 [24/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:27.686 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:27.686 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:27.686 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:27.686 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:27.686 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:27.686 [30/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:27.686 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:27.686 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:27.686 [33/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:27.686 [34/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:27.686 [35/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:27.686 [36/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:27.686 [37/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:27.686 [38/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:27.686 [39/378] Linking static target lib/librte_ring.a 00:02:27.686 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:27.686 [41/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:27.686 [42/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:27.686 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:27.686 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:27.686 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:27.686 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:27.686 [47/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:27.686 [48/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:27.686 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:27.686 [50/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:27.686 [51/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:27.686 [52/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:27.686 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:27.686 [54/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:27.686 [55/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:27.686 [56/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:27.686 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:27.686 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:27.686 [59/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:27.686 [60/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:27.686 [61/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:27.686 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:27.686 [63/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:27.686 [64/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:27.686 [65/378] Linking static target lib/librte_telemetry.a 00:02:27.686 [66/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:27.686 [67/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:27.686 [68/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:27.686 [69/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:27.686 [70/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:27.686 [71/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:27.686 [72/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:27.686 [73/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:27.686 
[74/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:27.686 [75/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:27.686 [76/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:27.686 [77/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:27.949 [78/378] Linking static target lib/librte_pci.a 00:02:27.949 [79/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:27.949 [80/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:27.949 [81/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:27.949 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:27.949 [83/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:27.949 [84/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:27.949 [85/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:27.949 [86/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:27.949 [87/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:27.949 [88/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:27.949 [89/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:27.949 [90/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:27.949 [91/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:27.949 [92/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:27.949 [93/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:27.950 [94/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:27.950 [95/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:27.950 [96/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:27.950 [97/378] Compiling C 
object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:27.950 [98/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:27.950 [99/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:27.950 [100/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:27.950 [101/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:27.950 [102/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:27.950 [103/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:27.950 [104/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:27.950 [105/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:27.950 [106/378] Linking static target lib/librte_rcu.a 00:02:27.950 [107/378] Linking static target lib/librte_mempool.a 00:02:27.950 [108/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:27.950 [109/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:27.950 [110/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:27.950 [111/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.950 [112/378] Linking static target lib/librte_meter.a 00:02:27.950 [113/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:28.237 [114/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:28.237 [115/378] Linking target lib/librte_log.so.24.1 00:02:28.237 [116/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:28.237 [117/378] Linking static target lib/librte_net.a 00:02:28.237 [118/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:28.237 [119/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:28.237 [120/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture 
output) 00:02:28.237 [121/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.237 [122/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:28.237 [123/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:28.237 [124/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:28.237 [125/378] Linking static target lib/librte_mbuf.a 00:02:28.542 [126/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:28.542 [127/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:28.542 [128/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:28.542 [129/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:28.542 [130/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:28.542 [131/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:28.542 [132/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:28.542 [133/378] Linking static target lib/librte_cmdline.a 00:02:28.542 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:28.542 [135/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:28.542 [136/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:28.542 [137/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:28.542 [138/378] Linking target lib/librte_kvargs.so.24.1 00:02:28.542 [139/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:28.542 [140/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:28.542 [141/378] Linking static target lib/librte_eal.a 00:02:28.542 [142/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:28.542 [143/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 
00:02:28.542 [144/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:28.542 [145/378] Linking static target lib/librte_timer.a 00:02:28.542 [146/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:28.542 [147/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.542 [148/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:28.542 [149/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.542 [150/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:28.542 [151/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:28.542 [152/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:28.542 [153/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:28.542 [154/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:28.542 [155/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:28.542 [156/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:28.542 [157/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:28.542 [158/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:28.542 [159/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:28.542 [160/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:28.542 [161/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:28.542 [162/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.542 [163/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:28.542 [164/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:28.543 
[165/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:28.543 [166/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:28.543 [167/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:28.543 [168/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.543 [169/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:28.543 [170/378] Linking static target lib/librte_dmadev.a 00:02:28.543 [171/378] Linking static target lib/librte_compressdev.a 00:02:28.543 [172/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:28.543 [173/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:28.543 [174/378] Linking target lib/librte_telemetry.so.24.1 00:02:28.543 [175/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:28.810 [176/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:28.810 [177/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:28.810 [178/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:28.810 [179/378] Linking static target lib/librte_power.a 00:02:28.810 [180/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:28.810 [181/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:28.810 [182/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:28.810 [183/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:28.810 [184/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:28.810 [185/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:28.810 [186/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:28.810 [187/378] Compiling C object 
lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:28.810 [188/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:28.810 [189/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:28.810 [190/378] Linking static target lib/librte_reorder.a 00:02:28.810 [191/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:28.810 [192/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:28.810 [193/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:28.810 [194/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:28.810 [195/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:28.810 [196/378] Linking static target lib/librte_security.a 00:02:28.810 [197/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:29.069 [198/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:29.069 [199/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:29.069 [200/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:29.069 [201/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:29.069 [202/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:29.069 [203/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:29.069 [204/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:29.069 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:29.069 [206/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:29.069 [207/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:29.069 [208/378] Compiling C object 
lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:29.069 [209/378] Linking static target drivers/librte_bus_vdev.a 00:02:29.069 [210/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:29.069 [211/378] Linking static target lib/librte_hash.a 00:02:29.069 [212/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:29.069 [213/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:29.069 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:29.069 [215/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:29.069 [216/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.069 [217/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:29.069 [218/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:29.069 [219/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:29.069 [220/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.069 [221/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:29.069 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:29.329 [223/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:29.329 [224/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:29.329 [225/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:29.330 [226/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:29.330 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 
00:02:29.330 [228/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:29.330 [229/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:29.330 [230/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:29.330 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:29.330 [232/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:29.330 [233/378] Linking static target drivers/librte_bus_pci.a 00:02:29.330 [234/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:29.330 [235/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.330 [236/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.330 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:29.330 [238/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:29.330 [239/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:29.330 [240/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:29.330 [241/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:29.330 [242/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:29.330 [243/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.330 [244/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:29.330 [245/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:29.330 [246/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:29.330 [247/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.330 [248/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.330 [249/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:29.330 [250/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:29.330 [251/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:29.330 [252/378] Linking static target lib/librte_cryptodev.a 00:02:29.330 [253/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:29.330 [254/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:29.330 [255/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:29.588 [256/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.588 [257/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:29.588 [258/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:29.588 [259/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.588 [260/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:29.588 [261/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:29.588 [262/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:29.588 [263/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:29.588 [264/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:29.588 [265/378] 
Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.588 [266/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:29.588 [267/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:29.588 [268/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:29.588 [269/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:29.588 [270/378] Linking static target lib/librte_ethdev.a 00:02:29.588 [271/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.588 [272/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:29.588 [273/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:29.846 [274/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:29.846 [275/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:29.846 [276/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:29.846 [277/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:29.846 [278/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:29.846 [279/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:29.846 [280/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:29.846 [281/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:29.846 [282/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:29.846 [283/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:29.846 [284/378] Compiling C object 
drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:29.846 [285/378] Linking static target drivers/librte_mempool_ring.a 00:02:29.846 [286/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:29.846 [287/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:29.846 [288/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:29.846 [289/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:29.846 [290/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:29.846 [291/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:29.846 [292/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:30.105 [293/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:30.105 [294/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:30.105 [295/378] Linking static target drivers/librte_compress_mlx5.a 00:02:30.105 [296/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:30.105 [297/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:30.105 [298/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.105 [299/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:30.105 [300/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.105 [301/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:30.105 [302/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:30.105 [303/378] 
Linking static target drivers/librte_compress_isal.a 00:02:30.105 [304/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:30.105 [305/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:30.105 [306/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:30.105 [307/378] Linking static target drivers/librte_common_mlx5.a 00:02:30.105 [308/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:30.105 [309/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:30.105 [310/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:30.105 [311/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:30.363 [312/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:30.363 [313/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:30.620 [314/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:30.620 [315/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:30.620 [316/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:30.620 [317/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:30.620 [318/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:30.620 [319/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:30.878 [320/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:30.878 [321/378] Linking static target lib/librte_vhost.a 00:02:30.878 [322/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:30.878 [323/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 
00:02:30.878 [324/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:31.136 [325/378] Linking static target drivers/librte_common_qat.a 00:02:31.704 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.079 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.366 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:38.902 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.439 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:41.439 [331/378] Linking target lib/librte_eal.so.24.1 00:02:41.698 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:41.698 [333/378] Linking target lib/librte_timer.so.24.1 00:02:41.698 [334/378] Linking target lib/librte_ring.so.24.1 00:02:41.698 [335/378] Linking target lib/librte_meter.so.24.1 00:02:41.698 [336/378] Linking target lib/librte_pci.so.24.1 00:02:41.698 [337/378] Linking target lib/librte_dmadev.so.24.1 00:02:41.698 [338/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:41.698 [339/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:41.698 [340/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:41.698 [341/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:41.698 [342/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:41.698 [343/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:41.698 [344/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:41.698 [345/378] Generating symbol file 
lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:41.698 [346/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:41.698 [347/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:41.957 [348/378] Linking target lib/librte_rcu.so.24.1 00:02:41.957 [349/378] Linking target lib/librte_mempool.so.24.1 00:02:41.957 [350/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:41.957 [351/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:41.957 [352/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:41.957 [353/378] Linking target lib/librte_mbuf.so.24.1 00:02:41.957 [354/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:42.216 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:42.216 [356/378] Linking target lib/librte_reorder.so.24.1 00:02:42.216 [357/378] Linking target lib/librte_compressdev.so.24.1 00:02:42.216 [358/378] Linking target lib/librte_net.so.24.1 00:02:42.216 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:02:42.475 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:42.475 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:42.475 [362/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:42.475 [363/378] Linking target lib/librte_cmdline.so.24.1 00:02:42.475 [364/378] Linking target lib/librte_hash.so.24.1 00:02:42.475 [365/378] Linking target lib/librte_security.so.24.1 00:02:42.475 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:42.475 [367/378] Linking target lib/librte_ethdev.so.24.1 00:02:42.736 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:42.736 [369/378] Generating symbol file 
lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:42.736 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:42.736 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:42.736 [372/378] Linking target lib/librte_vhost.so.24.1 00:02:42.736 [373/378] Linking target lib/librte_power.so.24.1 00:02:42.996 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:42.996 [375/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:42.996 [376/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:42.996 [377/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:42.996 [378/378] Linking target drivers/librte_common_qat.so.24.1 00:02:42.996 INFO: autodetecting backend as ninja 00:02:42.996 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:44.370 CC lib/ut_mock/mock.o 00:02:44.370 CC lib/ut/ut.o 00:02:44.370 CC lib/log/log.o 00:02:44.370 CC lib/log/log_flags.o 00:02:44.370 CC lib/log/log_deprecated.o 00:02:44.628 LIB libspdk_ut_mock.a 00:02:44.628 LIB libspdk_ut.a 00:02:44.628 LIB libspdk_log.a 00:02:44.628 SO libspdk_ut_mock.so.6.0 00:02:44.628 SO libspdk_ut.so.2.0 00:02:44.628 SO libspdk_log.so.7.0 00:02:44.628 SYMLINK libspdk_ut_mock.so 00:02:44.628 SYMLINK libspdk_ut.so 00:02:44.628 SYMLINK libspdk_log.so 00:02:45.194 CC lib/dma/dma.o 00:02:45.194 CC lib/util/base64.o 00:02:45.194 CC lib/util/bit_array.o 00:02:45.194 CC lib/util/cpuset.o 00:02:45.194 CC lib/util/crc16.o 00:02:45.194 CC lib/util/crc32.o 00:02:45.194 CC lib/util/crc32c.o 00:02:45.194 CC lib/util/crc32_ieee.o 00:02:45.194 CC lib/util/dif.o 00:02:45.194 CC lib/util/crc64.o 00:02:45.194 CC lib/util/fd.o 00:02:45.194 CC lib/util/iov.o 00:02:45.194 CC lib/util/file.o 00:02:45.194 CC lib/util/hexlify.o 00:02:45.194 CXX lib/trace_parser/trace.o 00:02:45.194 CC 
lib/util/pipe.o 00:02:45.194 CC lib/util/math.o 00:02:45.194 CC lib/ioat/ioat.o 00:02:45.194 CC lib/util/strerror_tls.o 00:02:45.194 CC lib/util/string.o 00:02:45.194 CC lib/util/fd_group.o 00:02:45.194 CC lib/util/uuid.o 00:02:45.194 CC lib/util/xor.o 00:02:45.194 CC lib/util/zipf.o 00:02:45.194 CC lib/vfio_user/host/vfio_user_pci.o 00:02:45.194 CC lib/vfio_user/host/vfio_user.o 00:02:45.453 LIB libspdk_ioat.a 00:02:45.453 SO libspdk_ioat.so.7.0 00:02:45.453 LIB libspdk_dma.a 00:02:45.453 LIB libspdk_vfio_user.a 00:02:45.453 SYMLINK libspdk_ioat.so 00:02:45.453 SO libspdk_dma.so.4.0 00:02:45.453 SO libspdk_vfio_user.so.5.0 00:02:45.711 SYMLINK libspdk_vfio_user.so 00:02:45.711 LIB libspdk_util.a 00:02:45.711 SYMLINK libspdk_dma.so 00:02:45.711 SO libspdk_util.so.9.1 00:02:45.970 SYMLINK libspdk_util.so 00:02:45.970 LIB libspdk_trace_parser.a 00:02:45.970 SO libspdk_trace_parser.so.5.0 00:02:46.229 SYMLINK libspdk_trace_parser.so 00:02:46.229 CC lib/idxd/idxd_kernel.o 00:02:46.229 CC lib/idxd/idxd.o 00:02:46.229 CC lib/idxd/idxd_user.o 00:02:46.229 CC lib/json/json_parse.o 00:02:46.229 CC lib/reduce/reduce.o 00:02:46.229 CC lib/json/json_util.o 00:02:46.229 CC lib/json/json_write.o 00:02:46.229 CC lib/conf/conf.o 00:02:46.229 CC lib/env_dpdk/env.o 00:02:46.229 CC lib/rdma_utils/rdma_utils.o 00:02:46.229 CC lib/env_dpdk/pci.o 00:02:46.229 CC lib/env_dpdk/memory.o 00:02:46.229 CC lib/env_dpdk/init.o 00:02:46.229 CC lib/env_dpdk/threads.o 00:02:46.229 CC lib/env_dpdk/pci_ioat.o 00:02:46.229 CC lib/rdma_provider/common.o 00:02:46.229 CC lib/env_dpdk/pci_virtio.o 00:02:46.229 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:46.229 CC lib/vmd/vmd.o 00:02:46.229 CC lib/env_dpdk/pci_vmd.o 00:02:46.229 CC lib/vmd/led.o 00:02:46.229 CC lib/env_dpdk/pci_idxd.o 00:02:46.229 CC lib/env_dpdk/pci_event.o 00:02:46.229 CC lib/env_dpdk/sigbus_handler.o 00:02:46.229 CC lib/env_dpdk/pci_dpdk.o 00:02:46.229 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:46.229 CC lib/env_dpdk/pci_dpdk_2211.o 
00:02:46.488 LIB libspdk_rdma_provider.a 00:02:46.488 LIB libspdk_conf.a 00:02:46.488 SO libspdk_rdma_provider.so.6.0 00:02:46.488 LIB libspdk_rdma_utils.a 00:02:46.488 SO libspdk_conf.so.6.0 00:02:46.488 LIB libspdk_json.a 00:02:46.488 SO libspdk_rdma_utils.so.1.0 00:02:46.488 SYMLINK libspdk_rdma_provider.so 00:02:46.746 SYMLINK libspdk_conf.so 00:02:46.746 SO libspdk_json.so.6.0 00:02:46.746 SYMLINK libspdk_json.so 00:02:46.746 SYMLINK libspdk_rdma_utils.so 00:02:46.746 LIB libspdk_idxd.a 00:02:46.746 SO libspdk_idxd.so.12.0 00:02:47.004 LIB libspdk_reduce.a 00:02:47.004 LIB libspdk_vmd.a 00:02:47.004 SYMLINK libspdk_idxd.so 00:02:47.004 SO libspdk_reduce.so.6.0 00:02:47.004 SO libspdk_vmd.so.6.0 00:02:47.004 SYMLINK libspdk_reduce.so 00:02:47.004 SYMLINK libspdk_vmd.so 00:02:47.004 CC lib/jsonrpc/jsonrpc_server.o 00:02:47.004 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:47.004 CC lib/jsonrpc/jsonrpc_client.o 00:02:47.004 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:47.573 LIB libspdk_jsonrpc.a 00:02:47.573 SO libspdk_jsonrpc.so.6.0 00:02:47.573 SYMLINK libspdk_jsonrpc.so 00:02:47.573 LIB libspdk_env_dpdk.a 00:02:47.832 SO libspdk_env_dpdk.so.14.1 00:02:47.832 SYMLINK libspdk_env_dpdk.so 00:02:48.121 CC lib/rpc/rpc.o 00:02:48.380 LIB libspdk_rpc.a 00:02:48.380 SO libspdk_rpc.so.6.0 00:02:48.380 SYMLINK libspdk_rpc.so 00:02:48.639 CC lib/notify/notify.o 00:02:48.639 CC lib/notify/notify_rpc.o 00:02:48.639 CC lib/keyring/keyring.o 00:02:48.639 CC lib/keyring/keyring_rpc.o 00:02:48.639 CC lib/trace/trace.o 00:02:48.639 CC lib/trace/trace_flags.o 00:02:48.639 CC lib/trace/trace_rpc.o 00:02:48.898 LIB libspdk_notify.a 00:02:48.898 SO libspdk_notify.so.6.0 00:02:48.898 LIB libspdk_trace.a 00:02:48.898 LIB libspdk_keyring.a 00:02:49.157 SYMLINK libspdk_notify.so 00:02:49.157 SO libspdk_trace.so.10.0 00:02:49.157 SO libspdk_keyring.so.1.0 00:02:49.157 SYMLINK libspdk_keyring.so 00:02:49.157 SYMLINK libspdk_trace.so 00:02:49.415 CC lib/thread/thread.o 00:02:49.415 CC 
lib/thread/iobuf.o 00:02:49.415 CC lib/sock/sock.o 00:02:49.415 CC lib/sock/sock_rpc.o 00:02:49.673 LIB libspdk_sock.a 00:02:49.932 SO libspdk_sock.so.10.0 00:02:49.932 SYMLINK libspdk_sock.so 00:02:50.191 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:50.191 CC lib/nvme/nvme_ctrlr.o 00:02:50.191 CC lib/nvme/nvme_fabric.o 00:02:50.191 CC lib/nvme/nvme_ns_cmd.o 00:02:50.191 CC lib/nvme/nvme_ns.o 00:02:50.191 CC lib/nvme/nvme_pcie_common.o 00:02:50.191 CC lib/nvme/nvme_pcie.o 00:02:50.191 CC lib/nvme/nvme.o 00:02:50.191 CC lib/nvme/nvme_qpair.o 00:02:50.191 CC lib/nvme/nvme_quirks.o 00:02:50.191 CC lib/nvme/nvme_transport.o 00:02:50.191 CC lib/nvme/nvme_discovery.o 00:02:50.191 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:50.191 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:50.191 CC lib/nvme/nvme_tcp.o 00:02:50.191 CC lib/nvme/nvme_opal.o 00:02:50.191 CC lib/nvme/nvme_io_msg.o 00:02:50.191 CC lib/nvme/nvme_poll_group.o 00:02:50.191 CC lib/nvme/nvme_zns.o 00:02:50.191 CC lib/nvme/nvme_stubs.o 00:02:50.191 CC lib/nvme/nvme_auth.o 00:02:50.191 CC lib/nvme/nvme_cuse.o 00:02:50.191 CC lib/nvme/nvme_rdma.o 00:02:51.128 LIB libspdk_thread.a 00:02:51.128 SO libspdk_thread.so.10.1 00:02:51.128 SYMLINK libspdk_thread.so 00:02:51.388 CC lib/blob/request.o 00:02:51.388 CC lib/virtio/virtio.o 00:02:51.388 CC lib/blob/blobstore.o 00:02:51.388 CC lib/virtio/virtio_vhost_user.o 00:02:51.388 CC lib/blob/zeroes.o 00:02:51.388 CC lib/virtio/virtio_vfio_user.o 00:02:51.388 CC lib/virtio/virtio_pci.o 00:02:51.388 CC lib/blob/blob_bs_dev.o 00:02:51.388 CC lib/accel/accel.o 00:02:51.388 CC lib/accel/accel_rpc.o 00:02:51.388 CC lib/accel/accel_sw.o 00:02:51.388 CC lib/init/json_config.o 00:02:51.388 CC lib/init/subsystem.o 00:02:51.388 CC lib/init/subsystem_rpc.o 00:02:51.388 CC lib/init/rpc.o 00:02:51.647 LIB libspdk_init.a 00:02:51.906 SO libspdk_init.so.5.0 00:02:51.906 LIB libspdk_virtio.a 00:02:51.906 SO libspdk_virtio.so.7.0 00:02:51.906 SYMLINK libspdk_init.so 00:02:51.906 LIB libspdk_nvme.a 00:02:51.906 
SYMLINK libspdk_virtio.so 00:02:52.165 SO libspdk_nvme.so.13.1 00:02:52.165 CC lib/event/app.o 00:02:52.165 CC lib/event/reactor.o 00:02:52.165 CC lib/event/log_rpc.o 00:02:52.165 CC lib/event/scheduler_static.o 00:02:52.165 CC lib/event/app_rpc.o 00:02:52.425 SYMLINK libspdk_nvme.so 00:02:52.684 LIB libspdk_accel.a 00:02:52.684 LIB libspdk_event.a 00:02:52.684 SO libspdk_accel.so.15.1 00:02:52.684 SO libspdk_event.so.14.0 00:02:52.684 SYMLINK libspdk_accel.so 00:02:52.943 SYMLINK libspdk_event.so 00:02:53.202 CC lib/bdev/bdev_rpc.o 00:02:53.202 CC lib/bdev/bdev.o 00:02:53.202 CC lib/bdev/part.o 00:02:53.202 CC lib/bdev/bdev_zone.o 00:02:53.202 CC lib/bdev/scsi_nvme.o 00:02:55.108 LIB libspdk_blob.a 00:02:55.108 SO libspdk_blob.so.11.0 00:02:55.108 SYMLINK libspdk_blob.so 00:02:55.367 CC lib/lvol/lvol.o 00:02:55.367 CC lib/blobfs/blobfs.o 00:02:55.367 CC lib/blobfs/tree.o 00:02:55.934 LIB libspdk_bdev.a 00:02:55.934 SO libspdk_bdev.so.15.1 00:02:55.934 SYMLINK libspdk_bdev.so 00:02:56.198 LIB libspdk_blobfs.a 00:02:56.199 CC lib/scsi/lun.o 00:02:56.199 CC lib/scsi/dev.o 00:02:56.199 CC lib/scsi/scsi_bdev.o 00:02:56.199 CC lib/scsi/port.o 00:02:56.199 CC lib/scsi/scsi.o 00:02:56.199 CC lib/ublk/ublk.o 00:02:56.199 CC lib/ublk/ublk_rpc.o 00:02:56.199 CC lib/scsi/task.o 00:02:56.199 CC lib/scsi/scsi_rpc.o 00:02:56.199 CC lib/scsi/scsi_pr.o 00:02:56.199 CC lib/nbd/nbd.o 00:02:56.199 CC lib/nbd/nbd_rpc.o 00:02:56.199 CC lib/nvmf/ctrlr.o 00:02:56.199 CC lib/nvmf/ctrlr_discovery.o 00:02:56.199 CC lib/ftl/ftl_init.o 00:02:56.199 CC lib/nvmf/ctrlr_bdev.o 00:02:56.199 CC lib/nvmf/subsystem.o 00:02:56.199 CC lib/ftl/ftl_core.o 00:02:56.199 CC lib/nvmf/nvmf.o 00:02:56.199 CC lib/nvmf/tcp.o 00:02:56.199 CC lib/ftl/ftl_layout.o 00:02:56.199 CC lib/nvmf/nvmf_rpc.o 00:02:56.199 CC lib/ftl/ftl_debug.o 00:02:56.199 CC lib/nvmf/stubs.o 00:02:56.199 CC lib/nvmf/transport.o 00:02:56.199 CC lib/ftl/ftl_io.o 00:02:56.199 CC lib/nvmf/rdma.o 00:02:56.199 CC lib/nvmf/mdns_server.o 
00:02:56.199 CC lib/ftl/ftl_l2p_flat.o 00:02:56.199 SO libspdk_blobfs.so.10.0 00:02:56.199 CC lib/ftl/ftl_sb.o 00:02:56.199 CC lib/ftl/ftl_l2p.o 00:02:56.199 CC lib/ftl/ftl_nv_cache.o 00:02:56.199 CC lib/nvmf/auth.o 00:02:56.199 CC lib/ftl/ftl_band_ops.o 00:02:56.199 CC lib/ftl/ftl_writer.o 00:02:56.199 CC lib/ftl/ftl_band.o 00:02:56.199 CC lib/ftl/ftl_reloc.o 00:02:56.199 CC lib/ftl/ftl_rq.o 00:02:56.199 CC lib/ftl/ftl_l2p_cache.o 00:02:56.199 CC lib/ftl/ftl_p2l.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:56.199 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:56.199 CC lib/ftl/utils/ftl_conf.o 00:02:56.199 CC lib/ftl/utils/ftl_md.o 00:02:56.199 CC lib/ftl/utils/ftl_mempool.o 00:02:56.199 CC lib/ftl/utils/ftl_bitmap.o 00:02:56.199 CC lib/ftl/utils/ftl_property.o 00:02:56.199 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:56.199 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:56.199 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:56.199 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:56.199 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:56.199 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:56.199 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:56.199 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:56.199 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:56.460 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:56.460 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:56.460 CC lib/ftl/base/ftl_base_dev.o 00:02:56.460 LIB libspdk_lvol.a 00:02:56.460 SO libspdk_lvol.so.10.0 00:02:56.460 SYMLINK libspdk_blobfs.so 00:02:56.460 CC 
lib/ftl/base/ftl_base_bdev.o 00:02:56.460 SYMLINK libspdk_lvol.so 00:02:56.720 CC lib/ftl/ftl_trace.o 00:02:56.979 LIB libspdk_nbd.a 00:02:56.979 SO libspdk_nbd.so.7.0 00:02:56.979 LIB libspdk_scsi.a 00:02:56.979 SYMLINK libspdk_nbd.so 00:02:56.979 SO libspdk_scsi.so.9.0 00:02:57.238 LIB libspdk_ublk.a 00:02:57.238 SYMLINK libspdk_scsi.so 00:02:57.238 SO libspdk_ublk.so.3.0 00:02:57.238 SYMLINK libspdk_ublk.so 00:02:57.238 LIB libspdk_ftl.a 00:02:57.497 CC lib/vhost/vhost.o 00:02:57.497 CC lib/vhost/vhost_rpc.o 00:02:57.497 CC lib/vhost/vhost_scsi.o 00:02:57.497 CC lib/vhost/rte_vhost_user.o 00:02:57.497 CC lib/vhost/vhost_blk.o 00:02:57.497 SO libspdk_ftl.so.9.0 00:02:57.497 CC lib/iscsi/init_grp.o 00:02:57.497 CC lib/iscsi/conn.o 00:02:57.497 CC lib/iscsi/iscsi.o 00:02:57.497 CC lib/iscsi/md5.o 00:02:57.497 CC lib/iscsi/param.o 00:02:57.497 CC lib/iscsi/iscsi_subsystem.o 00:02:57.497 CC lib/iscsi/portal_grp.o 00:02:57.497 CC lib/iscsi/tgt_node.o 00:02:57.497 CC lib/iscsi/iscsi_rpc.o 00:02:57.497 CC lib/iscsi/task.o 00:02:58.062 SYMLINK libspdk_ftl.so 00:02:58.319 LIB libspdk_nvmf.a 00:02:58.319 SO libspdk_nvmf.so.18.1 00:02:58.578 SYMLINK libspdk_nvmf.so 00:02:58.578 LIB libspdk_vhost.a 00:02:58.578 SO libspdk_vhost.so.8.0 00:02:58.836 SYMLINK libspdk_vhost.so 00:02:58.837 LIB libspdk_iscsi.a 00:02:59.095 SO libspdk_iscsi.so.8.0 00:02:59.095 SYMLINK libspdk_iscsi.so 00:02:59.659 CC module/env_dpdk/env_dpdk_rpc.o 00:02:59.917 CC module/blob/bdev/blob_bdev.o 00:02:59.917 CC module/accel/error/accel_error_rpc.o 00:02:59.917 CC module/accel/error/accel_error.o 00:02:59.917 CC module/scheduler/gscheduler/gscheduler.o 00:02:59.917 CC module/accel/ioat/accel_ioat_rpc.o 00:02:59.917 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:59.917 CC module/accel/ioat/accel_ioat.o 00:02:59.917 CC module/accel/dsa/accel_dsa.o 00:02:59.917 CC module/accel/dsa/accel_dsa_rpc.o 00:02:59.917 CC module/sock/posix/posix.o 00:02:59.917 CC 
module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:59.917 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:59.917 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:59.917 CC module/keyring/file/keyring.o 00:02:59.917 CC module/keyring/linux/keyring.o 00:02:59.917 CC module/keyring/file/keyring_rpc.o 00:02:59.917 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:59.917 CC module/keyring/linux/keyring_rpc.o 00:02:59.917 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:59.917 CC module/accel/iaa/accel_iaa.o 00:02:59.917 CC module/accel/iaa/accel_iaa_rpc.o 00:02:59.917 LIB libspdk_env_dpdk_rpc.a 00:02:59.917 SO libspdk_env_dpdk_rpc.so.6.0 00:02:59.917 SYMLINK libspdk_env_dpdk_rpc.so 00:02:59.917 LIB libspdk_scheduler_dpdk_governor.a 00:02:59.917 LIB libspdk_scheduler_gscheduler.a 00:03:00.175 LIB libspdk_keyring_file.a 00:03:00.175 LIB libspdk_accel_error.a 00:03:00.175 SO libspdk_scheduler_dpdk_governor.so.4.0 00:03:00.175 SO libspdk_scheduler_gscheduler.so.4.0 00:03:00.175 LIB libspdk_scheduler_dynamic.a 00:03:00.175 SO libspdk_keyring_file.so.1.0 00:03:00.175 LIB libspdk_accel_ioat.a 00:03:00.175 SO libspdk_accel_error.so.2.0 00:03:00.175 LIB libspdk_accel_iaa.a 00:03:00.175 SO libspdk_scheduler_dynamic.so.4.0 00:03:00.175 SYMLINK libspdk_scheduler_gscheduler.so 00:03:00.175 SYMLINK libspdk_scheduler_dpdk_governor.so 00:03:00.175 SO libspdk_accel_ioat.so.6.0 00:03:00.175 LIB libspdk_blob_bdev.a 00:03:00.175 LIB libspdk_accel_dsa.a 00:03:00.175 SYMLINK libspdk_keyring_file.so 00:03:00.175 SO libspdk_accel_iaa.so.3.0 00:03:00.175 LIB libspdk_keyring_linux.a 00:03:00.175 SYMLINK libspdk_accel_error.so 00:03:00.175 SYMLINK libspdk_scheduler_dynamic.so 00:03:00.175 SO libspdk_blob_bdev.so.11.0 00:03:00.175 SO libspdk_accel_dsa.so.5.0 00:03:00.175 SYMLINK libspdk_accel_ioat.so 00:03:00.175 SO libspdk_keyring_linux.so.1.0 00:03:00.175 SYMLINK libspdk_blob_bdev.so 00:03:00.176 SYMLINK libspdk_accel_iaa.so 
00:03:00.176 SYMLINK libspdk_accel_dsa.so 00:03:00.434 SYMLINK libspdk_keyring_linux.so 00:03:00.434 LIB libspdk_sock_posix.a 00:03:00.434 SO libspdk_sock_posix.so.6.0 00:03:00.692 SYMLINK libspdk_sock_posix.so 00:03:00.692 CC module/blobfs/bdev/blobfs_bdev.o 00:03:00.692 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:00.692 CC module/bdev/gpt/gpt.o 00:03:00.692 CC module/bdev/gpt/vbdev_gpt.o 00:03:00.692 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:00.692 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:00.692 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:00.692 CC module/bdev/nvme/bdev_nvme.o 00:03:00.692 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:00.692 CC module/bdev/error/vbdev_error.o 00:03:00.692 CC module/bdev/nvme/nvme_rpc.o 00:03:00.692 CC module/bdev/nvme/bdev_mdns_client.o 00:03:00.692 CC module/bdev/error/vbdev_error_rpc.o 00:03:00.692 CC module/bdev/nvme/vbdev_opal.o 00:03:00.692 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:00.692 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:00.692 CC module/bdev/malloc/bdev_malloc.o 00:03:00.692 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:00.692 CC module/bdev/crypto/vbdev_crypto.o 00:03:00.692 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:03:00.692 CC module/bdev/lvol/vbdev_lvol.o 00:03:00.692 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:00.692 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:00.692 CC module/bdev/split/vbdev_split.o 00:03:00.692 CC module/bdev/delay/vbdev_delay.o 00:03:00.692 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:00.692 CC module/bdev/split/vbdev_split_rpc.o 00:03:00.692 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:00.692 CC module/bdev/aio/bdev_aio.o 00:03:00.692 CC module/bdev/aio/bdev_aio_rpc.o 00:03:00.692 CC module/bdev/passthru/vbdev_passthru.o 00:03:00.692 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:00.692 CC module/bdev/null/bdev_null.o 00:03:00.692 CC module/bdev/null/bdev_null_rpc.o 00:03:00.692 CC module/bdev/compress/vbdev_compress.o 00:03:00.692 CC 
module/bdev/compress/vbdev_compress_rpc.o 00:03:00.692 CC module/bdev/iscsi/bdev_iscsi.o 00:03:00.692 CC module/bdev/raid/bdev_raid.o 00:03:00.692 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:00.692 CC module/bdev/raid/bdev_raid_rpc.o 00:03:00.692 CC module/bdev/raid/bdev_raid_sb.o 00:03:00.692 CC module/bdev/raid/raid0.o 00:03:00.692 CC module/bdev/raid/raid1.o 00:03:00.692 CC module/bdev/raid/concat.o 00:03:00.692 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:00.692 CC module/bdev/ftl/bdev_ftl.o 00:03:00.950 LIB libspdk_blobfs_bdev.a 00:03:00.950 SO libspdk_blobfs_bdev.so.6.0 00:03:00.950 LIB libspdk_accel_dpdk_cryptodev.a 00:03:00.950 LIB libspdk_accel_dpdk_compressdev.a 00:03:00.950 LIB libspdk_bdev_gpt.a 00:03:00.950 SYMLINK libspdk_blobfs_bdev.so 00:03:00.950 SO libspdk_accel_dpdk_compressdev.so.3.0 00:03:01.208 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:03:01.208 LIB libspdk_bdev_split.a 00:03:01.208 LIB libspdk_bdev_passthru.a 00:03:01.208 SO libspdk_bdev_gpt.so.6.0 00:03:01.208 LIB libspdk_bdev_error.a 00:03:01.208 SO libspdk_bdev_split.so.6.0 00:03:01.208 LIB libspdk_bdev_crypto.a 00:03:01.208 SO libspdk_bdev_passthru.so.6.0 00:03:01.208 LIB libspdk_bdev_aio.a 00:03:01.208 SO libspdk_bdev_error.so.6.0 00:03:01.208 LIB libspdk_bdev_null.a 00:03:01.208 SYMLINK libspdk_accel_dpdk_compressdev.so 00:03:01.208 LIB libspdk_bdev_compress.a 00:03:01.208 LIB libspdk_bdev_delay.a 00:03:01.208 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:03:01.208 SYMLINK libspdk_bdev_gpt.so 00:03:01.208 SO libspdk_bdev_crypto.so.6.0 00:03:01.208 LIB libspdk_bdev_malloc.a 00:03:01.208 SO libspdk_bdev_aio.so.6.0 00:03:01.208 SO libspdk_bdev_null.so.6.0 00:03:01.208 SO libspdk_bdev_compress.so.6.0 00:03:01.208 SYMLINK libspdk_bdev_split.so 00:03:01.208 SYMLINK libspdk_bdev_passthru.so 00:03:01.208 SO libspdk_bdev_delay.so.6.0 00:03:01.208 SYMLINK libspdk_bdev_error.so 00:03:01.208 SO libspdk_bdev_malloc.so.6.0 00:03:01.208 LIB libspdk_bdev_ftl.a 00:03:01.208 SYMLINK libspdk_bdev_crypto.so 
00:03:01.208 SYMLINK libspdk_bdev_aio.so 00:03:01.208 SYMLINK libspdk_bdev_null.so 00:03:01.208 SYMLINK libspdk_bdev_compress.so 00:03:01.208 SO libspdk_bdev_ftl.so.6.0 00:03:01.208 LIB libspdk_bdev_zone_block.a 00:03:01.208 SYMLINK libspdk_bdev_delay.so 00:03:01.208 LIB libspdk_bdev_virtio.a 00:03:01.208 SYMLINK libspdk_bdev_malloc.so 00:03:01.208 LIB libspdk_bdev_iscsi.a 00:03:01.208 LIB libspdk_bdev_lvol.a 00:03:01.208 SO libspdk_bdev_zone_block.so.6.0 00:03:01.466 SO libspdk_bdev_iscsi.so.6.0 00:03:01.466 SO libspdk_bdev_virtio.so.6.0 00:03:01.466 SYMLINK libspdk_bdev_ftl.so 00:03:01.466 SO libspdk_bdev_lvol.so.6.0 00:03:01.466 SYMLINK libspdk_bdev_iscsi.so 00:03:01.466 SYMLINK libspdk_bdev_lvol.so 00:03:01.466 SYMLINK libspdk_bdev_virtio.so 00:03:01.466 SYMLINK libspdk_bdev_zone_block.so 00:03:01.723 LIB libspdk_bdev_raid.a 00:03:01.723 SO libspdk_bdev_raid.so.6.0 00:03:01.723 SYMLINK libspdk_bdev_raid.so 00:03:03.098 LIB libspdk_bdev_nvme.a 00:03:03.098 SO libspdk_bdev_nvme.so.7.0 00:03:03.355 SYMLINK libspdk_bdev_nvme.so 00:03:03.920 CC module/event/subsystems/keyring/keyring.o 00:03:03.920 CC module/event/subsystems/vmd/vmd.o 00:03:03.920 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:03.920 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:03.920 CC module/event/subsystems/sock/sock.o 00:03:03.920 CC module/event/subsystems/scheduler/scheduler.o 00:03:04.177 CC module/event/subsystems/iobuf/iobuf.o 00:03:04.177 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:04.177 LIB libspdk_event_keyring.a 00:03:04.177 LIB libspdk_event_vhost_blk.a 00:03:04.177 SO libspdk_event_keyring.so.1.0 00:03:04.177 LIB libspdk_event_scheduler.a 00:03:04.177 LIB libspdk_event_vmd.a 00:03:04.177 LIB libspdk_event_sock.a 00:03:04.177 LIB libspdk_event_iobuf.a 00:03:04.177 SO libspdk_event_scheduler.so.4.0 00:03:04.177 SO libspdk_event_vhost_blk.so.3.0 00:03:04.177 SO libspdk_event_sock.so.5.0 00:03:04.177 SO libspdk_event_vmd.so.6.0 00:03:04.177 SYMLINK 
libspdk_event_keyring.so 00:03:04.178 SO libspdk_event_iobuf.so.3.0 00:03:04.435 SYMLINK libspdk_event_scheduler.so 00:03:04.435 SYMLINK libspdk_event_sock.so 00:03:04.435 SYMLINK libspdk_event_vmd.so 00:03:04.435 SYMLINK libspdk_event_iobuf.so 00:03:04.435 SYMLINK libspdk_event_vhost_blk.so 00:03:04.691 CC module/event/subsystems/accel/accel.o 00:03:04.949 LIB libspdk_event_accel.a 00:03:04.949 SO libspdk_event_accel.so.6.0 00:03:04.949 SYMLINK libspdk_event_accel.so 00:03:05.206 CC module/event/subsystems/bdev/bdev.o 00:03:05.464 LIB libspdk_event_bdev.a 00:03:05.464 SO libspdk_event_bdev.so.6.0 00:03:05.723 SYMLINK libspdk_event_bdev.so 00:03:06.005 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:06.005 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:06.005 CC module/event/subsystems/nbd/nbd.o 00:03:06.005 CC module/event/subsystems/ublk/ublk.o 00:03:06.005 CC module/event/subsystems/scsi/scsi.o 00:03:06.276 LIB libspdk_event_nbd.a 00:03:06.276 LIB libspdk_event_ublk.a 00:03:06.276 LIB libspdk_event_scsi.a 00:03:06.276 SO libspdk_event_nbd.so.6.0 00:03:06.276 SO libspdk_event_ublk.so.3.0 00:03:06.276 LIB libspdk_event_nvmf.a 00:03:06.276 SO libspdk_event_scsi.so.6.0 00:03:06.276 SYMLINK libspdk_event_nbd.so 00:03:06.276 SO libspdk_event_nvmf.so.6.0 00:03:06.276 SYMLINK libspdk_event_ublk.so 00:03:06.276 SYMLINK libspdk_event_scsi.so 00:03:06.276 SYMLINK libspdk_event_nvmf.so 00:03:06.534 CC module/event/subsystems/iscsi/iscsi.o 00:03:06.534 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:06.791 LIB libspdk_event_iscsi.a 00:03:06.791 LIB libspdk_event_vhost_scsi.a 00:03:06.791 SO libspdk_event_vhost_scsi.so.3.0 00:03:06.791 SO libspdk_event_iscsi.so.6.0 00:03:07.049 SYMLINK libspdk_event_vhost_scsi.so 00:03:07.049 SYMLINK libspdk_event_iscsi.so 00:03:07.049 SO libspdk.so.6.0 00:03:07.049 SYMLINK libspdk.so 00:03:07.621 TEST_HEADER include/spdk/accel.h 00:03:07.621 TEST_HEADER include/spdk/accel_module.h 00:03:07.621 TEST_HEADER include/spdk/assert.h 
00:03:07.621 TEST_HEADER include/spdk/barrier.h 00:03:07.621 TEST_HEADER include/spdk/base64.h 00:03:07.621 TEST_HEADER include/spdk/bdev_module.h 00:03:07.621 TEST_HEADER include/spdk/bdev_zone.h 00:03:07.621 TEST_HEADER include/spdk/bit_array.h 00:03:07.621 TEST_HEADER include/spdk/bdev.h 00:03:07.621 TEST_HEADER include/spdk/bit_pool.h 00:03:07.621 CC app/spdk_lspci/spdk_lspci.o 00:03:07.621 TEST_HEADER include/spdk/blob_bdev.h 00:03:07.621 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:07.621 TEST_HEADER include/spdk/blobfs.h 00:03:07.621 CC app/trace_record/trace_record.o 00:03:07.621 CC app/spdk_nvme_perf/perf.o 00:03:07.621 TEST_HEADER include/spdk/blob.h 00:03:07.621 TEST_HEADER include/spdk/config.h 00:03:07.621 TEST_HEADER include/spdk/conf.h 00:03:07.621 CC test/rpc_client/rpc_client_test.o 00:03:07.621 TEST_HEADER include/spdk/cpuset.h 00:03:07.621 CXX app/trace/trace.o 00:03:07.621 TEST_HEADER include/spdk/crc16.h 00:03:07.621 CC app/spdk_nvme_discover/discovery_aer.o 00:03:07.621 TEST_HEADER include/spdk/crc32.h 00:03:07.621 TEST_HEADER include/spdk/crc64.h 00:03:07.621 TEST_HEADER include/spdk/dif.h 00:03:07.621 TEST_HEADER include/spdk/dma.h 00:03:07.621 TEST_HEADER include/spdk/endian.h 00:03:07.621 TEST_HEADER include/spdk/env_dpdk.h 00:03:07.621 TEST_HEADER include/spdk/env.h 00:03:07.621 TEST_HEADER include/spdk/event.h 00:03:07.621 TEST_HEADER include/spdk/fd_group.h 00:03:07.621 TEST_HEADER include/spdk/fd.h 00:03:07.621 TEST_HEADER include/spdk/file.h 00:03:07.621 TEST_HEADER include/spdk/ftl.h 00:03:07.621 TEST_HEADER include/spdk/gpt_spec.h 00:03:07.621 TEST_HEADER include/spdk/hexlify.h 00:03:07.621 CC app/spdk_top/spdk_top.o 00:03:07.621 CC app/spdk_nvme_identify/identify.o 00:03:07.621 TEST_HEADER include/spdk/histogram_data.h 00:03:07.621 TEST_HEADER include/spdk/idxd.h 00:03:07.621 TEST_HEADER include/spdk/idxd_spec.h 00:03:07.621 TEST_HEADER include/spdk/init.h 00:03:07.621 TEST_HEADER include/spdk/ioat.h 00:03:07.621 TEST_HEADER 
include/spdk/ioat_spec.h 00:03:07.621 TEST_HEADER include/spdk/iscsi_spec.h 00:03:07.621 TEST_HEADER include/spdk/json.h 00:03:07.621 TEST_HEADER include/spdk/jsonrpc.h 00:03:07.621 TEST_HEADER include/spdk/keyring_module.h 00:03:07.621 TEST_HEADER include/spdk/keyring.h 00:03:07.621 TEST_HEADER include/spdk/likely.h 00:03:07.621 TEST_HEADER include/spdk/log.h 00:03:07.621 TEST_HEADER include/spdk/memory.h 00:03:07.621 TEST_HEADER include/spdk/mmio.h 00:03:07.621 TEST_HEADER include/spdk/nbd.h 00:03:07.621 TEST_HEADER include/spdk/lvol.h 00:03:07.621 TEST_HEADER include/spdk/nvme.h 00:03:07.621 TEST_HEADER include/spdk/notify.h 00:03:07.621 TEST_HEADER include/spdk/nvme_intel.h 00:03:07.621 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:07.621 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:07.621 TEST_HEADER include/spdk/nvme_spec.h 00:03:07.621 TEST_HEADER include/spdk/nvme_zns.h 00:03:07.621 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:07.621 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:07.621 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:07.621 TEST_HEADER include/spdk/nvmf.h 00:03:07.621 TEST_HEADER include/spdk/nvmf_transport.h 00:03:07.621 TEST_HEADER include/spdk/nvmf_spec.h 00:03:07.621 TEST_HEADER include/spdk/opal.h 00:03:07.621 TEST_HEADER include/spdk/opal_spec.h 00:03:07.621 CC app/iscsi_tgt/iscsi_tgt.o 00:03:07.621 TEST_HEADER include/spdk/pci_ids.h 00:03:07.621 CC app/nvmf_tgt/nvmf_main.o 00:03:07.621 TEST_HEADER include/spdk/pipe.h 00:03:07.621 TEST_HEADER include/spdk/queue.h 00:03:07.621 TEST_HEADER include/spdk/reduce.h 00:03:07.621 TEST_HEADER include/spdk/rpc.h 00:03:07.621 TEST_HEADER include/spdk/scheduler.h 00:03:07.621 TEST_HEADER include/spdk/scsi.h 00:03:07.621 TEST_HEADER include/spdk/scsi_spec.h 00:03:07.621 TEST_HEADER include/spdk/sock.h 00:03:07.621 TEST_HEADER include/spdk/stdinc.h 00:03:07.621 TEST_HEADER include/spdk/string.h 00:03:07.621 TEST_HEADER include/spdk/thread.h 00:03:07.621 TEST_HEADER include/spdk/trace.h 00:03:07.621 
TEST_HEADER include/spdk/trace_parser.h 00:03:07.621 TEST_HEADER include/spdk/tree.h 00:03:07.621 TEST_HEADER include/spdk/ublk.h 00:03:07.621 TEST_HEADER include/spdk/util.h 00:03:07.621 TEST_HEADER include/spdk/uuid.h 00:03:07.621 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:07.621 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:07.621 TEST_HEADER include/spdk/version.h 00:03:07.621 TEST_HEADER include/spdk/vhost.h 00:03:07.621 TEST_HEADER include/spdk/vmd.h 00:03:07.621 TEST_HEADER include/spdk/xor.h 00:03:07.621 TEST_HEADER include/spdk/zipf.h 00:03:07.621 CC app/spdk_dd/spdk_dd.o 00:03:07.621 CXX test/cpp_headers/accel.o 00:03:07.621 CXX test/cpp_headers/accel_module.o 00:03:07.621 CXX test/cpp_headers/assert.o 00:03:07.621 CXX test/cpp_headers/barrier.o 00:03:07.621 CXX test/cpp_headers/base64.o 00:03:07.621 CXX test/cpp_headers/bdev.o 00:03:07.621 CXX test/cpp_headers/bdev_module.o 00:03:07.621 CXX test/cpp_headers/bdev_zone.o 00:03:07.621 CXX test/cpp_headers/bit_array.o 00:03:07.621 CXX test/cpp_headers/bit_pool.o 00:03:07.621 CXX test/cpp_headers/blob_bdev.o 00:03:07.621 CXX test/cpp_headers/blobfs_bdev.o 00:03:07.621 CXX test/cpp_headers/blobfs.o 00:03:07.621 CXX test/cpp_headers/blob.o 00:03:07.621 CXX test/cpp_headers/config.o 00:03:07.621 CXX test/cpp_headers/crc16.o 00:03:07.621 CXX test/cpp_headers/cpuset.o 00:03:07.621 CXX test/cpp_headers/crc64.o 00:03:07.621 CXX test/cpp_headers/crc32.o 00:03:07.621 CXX test/cpp_headers/conf.o 00:03:07.621 CXX test/cpp_headers/dif.o 00:03:07.621 CXX test/cpp_headers/endian.o 00:03:07.621 CXX test/cpp_headers/dma.o 00:03:07.621 CXX test/cpp_headers/env_dpdk.o 00:03:07.621 CXX test/cpp_headers/env.o 00:03:07.621 CXX test/cpp_headers/event.o 00:03:07.621 CXX test/cpp_headers/fd.o 00:03:07.621 CXX test/cpp_headers/file.o 00:03:07.621 CXX test/cpp_headers/gpt_spec.o 00:03:07.621 CXX test/cpp_headers/ftl.o 00:03:07.621 CXX test/cpp_headers/fd_group.o 00:03:07.621 CXX test/cpp_headers/hexlify.o 00:03:07.621 CXX 
test/cpp_headers/histogram_data.o 00:03:07.621 CXX test/cpp_headers/idxd_spec.o 00:03:07.621 CXX test/cpp_headers/idxd.o 00:03:07.621 CXX test/cpp_headers/init.o 00:03:07.621 CXX test/cpp_headers/ioat_spec.o 00:03:07.621 CXX test/cpp_headers/ioat.o 00:03:07.621 CXX test/cpp_headers/iscsi_spec.o 00:03:07.621 CXX test/cpp_headers/json.o 00:03:07.621 CXX test/cpp_headers/jsonrpc.o 00:03:07.621 CXX test/cpp_headers/keyring.o 00:03:07.621 CXX test/cpp_headers/keyring_module.o 00:03:07.621 CC test/env/memory/memory_ut.o 00:03:07.621 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:07.621 CC examples/ioat/verify/verify.o 00:03:07.621 CC test/env/vtophys/vtophys.o 00:03:07.621 CC app/spdk_tgt/spdk_tgt.o 00:03:07.622 CC test/env/pci/pci_ut.o 00:03:07.622 CC examples/ioat/perf/perf.o 00:03:07.622 CC test/app/histogram_perf/histogram_perf.o 00:03:07.622 CC test/thread/poller_perf/poller_perf.o 00:03:07.622 CC test/app/jsoncat/jsoncat.o 00:03:07.622 CC examples/util/zipf/zipf.o 00:03:07.622 CC app/fio/nvme/fio_plugin.o 00:03:07.884 CC test/app/stub/stub.o 00:03:07.884 CC app/fio/bdev/fio_plugin.o 00:03:07.884 CC test/app/bdev_svc/bdev_svc.o 00:03:07.884 CC test/dma/test_dma/test_dma.o 00:03:07.884 LINK spdk_lspci 00:03:07.884 LINK rpc_client_test 00:03:07.884 CC test/env/mem_callbacks/mem_callbacks.o 00:03:07.884 LINK spdk_nvme_discover 00:03:08.146 LINK nvmf_tgt 00:03:08.146 LINK interrupt_tgt 00:03:08.146 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:08.146 LINK histogram_perf 00:03:08.146 CXX test/cpp_headers/likely.o 00:03:08.146 LINK vtophys 00:03:08.146 CXX test/cpp_headers/log.o 00:03:08.146 LINK env_dpdk_post_init 00:03:08.146 CXX test/cpp_headers/lvol.o 00:03:08.146 CXX test/cpp_headers/memory.o 00:03:08.146 LINK iscsi_tgt 00:03:08.146 CXX test/cpp_headers/mmio.o 00:03:08.146 CXX test/cpp_headers/nbd.o 00:03:08.146 LINK zipf 00:03:08.146 CXX test/cpp_headers/notify.o 00:03:08.146 CXX test/cpp_headers/nvme_intel.o 00:03:08.146 CXX test/cpp_headers/nvme.o 
00:03:08.146 LINK stub 00:03:08.146 CXX test/cpp_headers/nvme_ocssd.o 00:03:08.146 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:08.146 LINK jsoncat 00:03:08.146 CXX test/cpp_headers/nvme_spec.o 00:03:08.146 CXX test/cpp_headers/nvme_zns.o 00:03:08.146 CXX test/cpp_headers/nvmf_cmd.o 00:03:08.146 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:08.146 CXX test/cpp_headers/nvmf.o 00:03:08.146 CXX test/cpp_headers/nvmf_spec.o 00:03:08.146 LINK spdk_trace_record 00:03:08.146 CXX test/cpp_headers/nvmf_transport.o 00:03:08.146 LINK verify 00:03:08.146 CXX test/cpp_headers/opal.o 00:03:08.146 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:08.146 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:08.146 CXX test/cpp_headers/opal_spec.o 00:03:08.146 CXX test/cpp_headers/pci_ids.o 00:03:08.146 CXX test/cpp_headers/pipe.o 00:03:08.146 LINK poller_perf 00:03:08.146 CXX test/cpp_headers/queue.o 00:03:08.146 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:08.146 CXX test/cpp_headers/reduce.o 00:03:08.409 CXX test/cpp_headers/rpc.o 00:03:08.409 LINK bdev_svc 00:03:08.409 CXX test/cpp_headers/scheduler.o 00:03:08.409 CXX test/cpp_headers/scsi.o 00:03:08.409 CXX test/cpp_headers/scsi_spec.o 00:03:08.409 CXX test/cpp_headers/sock.o 00:03:08.409 CXX test/cpp_headers/stdinc.o 00:03:08.409 CXX test/cpp_headers/string.o 00:03:08.409 CXX test/cpp_headers/thread.o 00:03:08.409 LINK spdk_tgt 00:03:08.409 CXX test/cpp_headers/trace.o 00:03:08.409 CXX test/cpp_headers/trace_parser.o 00:03:08.409 CXX test/cpp_headers/ublk.o 00:03:08.409 CXX test/cpp_headers/tree.o 00:03:08.409 LINK spdk_dd 00:03:08.409 CXX test/cpp_headers/util.o 00:03:08.409 CXX test/cpp_headers/uuid.o 00:03:08.409 CXX test/cpp_headers/version.o 00:03:08.409 CXX test/cpp_headers/vfio_user_pci.o 00:03:08.409 CXX test/cpp_headers/vfio_user_spec.o 00:03:08.409 CXX test/cpp_headers/vhost.o 00:03:08.409 CXX test/cpp_headers/vmd.o 00:03:08.409 CXX test/cpp_headers/xor.o 00:03:08.409 CXX test/cpp_headers/zipf.o 00:03:08.669 LINK pci_ut 
00:03:08.669 LINK ioat_perf 00:03:08.669 LINK spdk_trace 00:03:08.669 LINK test_dma 00:03:08.669 CC examples/vmd/lsvmd/lsvmd.o 00:03:08.669 CC examples/vmd/led/led.o 00:03:08.669 LINK spdk_nvme_perf 00:03:08.669 CC examples/sock/hello_world/hello_sock.o 00:03:08.669 CC examples/idxd/perf/perf.o 00:03:08.669 LINK nvme_fuzz 00:03:08.927 CC examples/thread/thread/thread_ex.o 00:03:08.927 LINK spdk_nvme 00:03:08.927 LINK mem_callbacks 00:03:08.927 LINK spdk_bdev 00:03:08.927 LINK lsvmd 00:03:08.927 CC test/event/reactor/reactor.o 00:03:08.927 CC test/event/event_perf/event_perf.o 00:03:08.927 CC test/event/reactor_perf/reactor_perf.o 00:03:08.927 LINK led 00:03:08.927 CC test/event/app_repeat/app_repeat.o 00:03:08.927 LINK spdk_nvme_identify 00:03:08.927 CC test/event/scheduler/scheduler.o 00:03:08.927 LINK vhost_fuzz 00:03:08.928 LINK spdk_top 00:03:08.928 LINK hello_sock 00:03:09.185 LINK reactor 00:03:09.185 CC app/vhost/vhost.o 00:03:09.185 LINK thread 00:03:09.185 LINK event_perf 00:03:09.185 LINK reactor_perf 00:03:09.185 LINK app_repeat 00:03:09.185 CC test/nvme/reserve/reserve.o 00:03:09.185 CC test/nvme/startup/startup.o 00:03:09.185 CC test/nvme/boot_partition/boot_partition.o 00:03:09.185 CC test/nvme/e2edp/nvme_dp.o 00:03:09.185 CC test/nvme/aer/aer.o 00:03:09.185 CC test/nvme/sgl/sgl.o 00:03:09.185 CC test/nvme/cuse/cuse.o 00:03:09.185 CC test/nvme/simple_copy/simple_copy.o 00:03:09.185 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:09.185 CC test/nvme/reset/reset.o 00:03:09.185 CC test/nvme/err_injection/err_injection.o 00:03:09.185 CC test/nvme/compliance/nvme_compliance.o 00:03:09.185 CC test/nvme/overhead/overhead.o 00:03:09.185 CC test/nvme/fdp/fdp.o 00:03:09.185 CC test/nvme/connect_stress/connect_stress.o 00:03:09.185 CC test/nvme/fused_ordering/fused_ordering.o 00:03:09.185 LINK scheduler 00:03:09.185 CC test/accel/dif/dif.o 00:03:09.185 CC test/blobfs/mkfs/mkfs.o 00:03:09.442 LINK vhost 00:03:09.442 CC test/lvol/esnap/esnap.o 00:03:09.442 LINK 
reserve 00:03:09.442 LINK startup 00:03:09.442 LINK boot_partition 00:03:09.442 LINK connect_stress 00:03:09.442 LINK err_injection 00:03:09.442 LINK aer 00:03:09.442 LINK idxd_perf 00:03:09.442 LINK mkfs 00:03:09.442 LINK fused_ordering 00:03:09.442 LINK simple_copy 00:03:09.442 LINK sgl 00:03:09.442 LINK doorbell_aers 00:03:09.442 CC examples/nvme/reconnect/reconnect.o 00:03:09.442 LINK memory_ut 00:03:09.442 CC examples/nvme/hello_world/hello_world.o 00:03:09.442 LINK reset 00:03:09.442 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:09.442 CC examples/nvme/abort/abort.o 00:03:09.442 CC examples/nvme/hotplug/hotplug.o 00:03:09.442 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:09.442 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:09.442 CC examples/nvme/arbitration/arbitration.o 00:03:09.700 LINK overhead 00:03:09.700 LINK nvme_dp 00:03:09.700 LINK fdp 00:03:09.700 LINK nvme_compliance 00:03:09.700 CC examples/accel/perf/accel_perf.o 00:03:09.700 CC examples/blob/cli/blobcli.o 00:03:09.700 CC examples/blob/hello_world/hello_blob.o 00:03:09.700 LINK dif 00:03:09.700 LINK pmr_persistence 00:03:09.700 LINK cmb_copy 00:03:09.700 LINK hello_world 00:03:09.958 LINK hotplug 00:03:09.958 LINK arbitration 00:03:09.958 LINK reconnect 00:03:09.958 LINK abort 00:03:09.958 LINK hello_blob 00:03:10.216 LINK nvme_manage 00:03:10.216 LINK accel_perf 00:03:10.216 LINK blobcli 00:03:10.216 LINK iscsi_fuzz 00:03:10.475 CC test/bdev/bdevio/bdevio.o 00:03:10.475 LINK cuse 00:03:10.734 CC examples/bdev/bdevperf/bdevperf.o 00:03:10.734 CC examples/bdev/hello_world/hello_bdev.o 00:03:10.734 LINK bdevio 00:03:10.992 LINK hello_bdev 00:03:11.558 LINK bdevperf 00:03:12.127 CC examples/nvmf/nvmf/nvmf.o 00:03:12.694 LINK nvmf 00:03:13.262 LINK esnap 00:03:13.527 00:03:13.527 real 1m31.079s 00:03:13.527 user 17m22.965s 00:03:13.527 sys 4m11.653s 00:03:13.527 18:06:57 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:03:13.527 18:06:57 make -- common/autotest_common.sh@10 
-- $ set +x 00:03:13.527 ************************************ 00:03:13.527 END TEST make 00:03:13.527 ************************************ 00:03:13.786 18:06:57 -- common/autotest_common.sh@1142 -- $ return 0 00:03:13.786 18:06:57 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:13.786 18:06:57 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:13.786 18:06:57 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:13.786 18:06:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.786 18:06:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:13.786 18:06:57 -- pm/common@44 -- $ pid=2300787 00:03:13.786 18:06:57 -- pm/common@50 -- $ kill -TERM 2300787 00:03:13.786 18:06:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.786 18:06:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:13.786 18:06:57 -- pm/common@44 -- $ pid=2300789 00:03:13.786 18:06:57 -- pm/common@50 -- $ kill -TERM 2300789 00:03:13.786 18:06:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.786 18:06:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:13.786 18:06:57 -- pm/common@44 -- $ pid=2300791 00:03:13.786 18:06:57 -- pm/common@50 -- $ kill -TERM 2300791 00:03:13.786 18:06:57 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.786 18:06:57 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:13.786 18:06:57 -- pm/common@44 -- $ pid=2300815 00:03:13.786 18:06:57 -- pm/common@50 -- $ sudo -E kill -TERM 2300815 00:03:13.786 18:06:57 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:13.786 18:06:57 -- nvmf/common.sh@7 -- # uname -s 00:03:13.786 18:06:57 -- 
nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:13.786 18:06:57 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:13.786 18:06:57 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:13.786 18:06:57 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:13.786 18:06:57 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:13.786 18:06:57 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:13.786 18:06:57 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:13.786 18:06:57 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:13.786 18:06:57 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:13.786 18:06:57 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:13.786 18:06:57 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:03:13.786 18:06:57 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:03:13.786 18:06:57 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:13.786 18:06:57 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:13.786 18:06:57 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:13.786 18:06:57 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:13.786 18:06:57 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:13.786 18:06:57 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:13.786 18:06:57 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:13.786 18:06:57 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:13.786 18:06:57 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:13.786 18:06:57 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:13.786 18:06:57 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:13.786 18:06:57 -- paths/export.sh@5 -- # export PATH 00:03:13.786 18:06:57 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:13.786 18:06:57 -- nvmf/common.sh@47 -- # : 0 00:03:13.786 18:06:57 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:13.786 18:06:57 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:13.786 18:06:57 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:13.786 18:06:57 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:13.786 18:06:57 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:13.786 18:06:57 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:13.786 18:06:57 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:13.786 18:06:57 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:13.786 18:06:57 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:13.786 18:06:57 -- spdk/autotest.sh@32 -- # uname -s 00:03:13.786 18:06:57 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:13.786 18:06:57 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:13.786 18:06:57 -- spdk/autotest.sh@34 -- # mkdir -p 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:13.786 18:06:57 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:13.786 18:06:57 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:13.786 18:06:57 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:13.786 18:06:57 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:13.786 18:06:57 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:13.786 18:06:57 -- spdk/autotest.sh@48 -- # udevadm_pid=2367614 00:03:13.786 18:06:57 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:13.786 18:06:57 -- pm/common@17 -- # local monitor 00:03:13.786 18:06:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.786 18:06:57 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:13.786 18:06:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.786 18:06:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.786 18:06:57 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:13.786 18:06:57 -- pm/common@25 -- # sleep 1 00:03:13.786 18:06:57 -- pm/common@21 -- # date +%s 00:03:13.786 18:06:57 -- pm/common@21 -- # date +%s 00:03:13.786 18:06:57 -- pm/common@21 -- # date +%s 00:03:13.786 18:06:57 -- pm/common@21 -- # date +%s 00:03:13.786 18:06:57 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720800417 00:03:13.786 18:06:57 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720800417 00:03:13.786 18:06:57 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720800417 00:03:13.786 18:06:57 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720800417 00:03:13.786 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720800417_collect-vmstat.pm.log 00:03:13.786 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720800417_collect-cpu-temp.pm.log 00:03:13.786 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720800417_collect-cpu-load.pm.log 00:03:13.786 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720800417_collect-bmc-pm.bmc.pm.log 00:03:14.722 18:06:58 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:14.722 18:06:58 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:14.722 18:06:58 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:14.722 18:06:58 -- common/autotest_common.sh@10 -- # set +x 00:03:14.722 18:06:58 -- spdk/autotest.sh@59 -- # create_test_list 00:03:14.722 18:06:58 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:14.722 18:06:58 -- common/autotest_common.sh@10 -- # set +x 00:03:14.980 18:06:58 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:14.980 18:06:58 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:14.980 18:06:58 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:14.980 18:06:58 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:14.980 18:06:58 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:03:14.980 18:06:58 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:14.980 18:06:58 -- common/autotest_common.sh@1455 -- # uname 00:03:14.980 18:06:58 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:14.980 18:06:58 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:14.980 18:06:58 -- common/autotest_common.sh@1475 -- # uname 00:03:14.980 18:06:58 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:14.980 18:06:58 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:14.980 18:06:58 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:14.980 18:06:58 -- spdk/autotest.sh@72 -- # hash lcov 00:03:14.980 18:06:58 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:14.980 18:06:58 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:14.980 --rc lcov_branch_coverage=1 00:03:14.980 --rc lcov_function_coverage=1 00:03:14.980 --rc genhtml_branch_coverage=1 00:03:14.980 --rc genhtml_function_coverage=1 00:03:14.980 --rc genhtml_legend=1 00:03:14.980 --rc geninfo_all_blocks=1 00:03:14.980 ' 00:03:14.980 18:06:58 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:14.980 --rc lcov_branch_coverage=1 00:03:14.980 --rc lcov_function_coverage=1 00:03:14.980 --rc genhtml_branch_coverage=1 00:03:14.980 --rc genhtml_function_coverage=1 00:03:14.980 --rc genhtml_legend=1 00:03:14.980 --rc geninfo_all_blocks=1 00:03:14.980 ' 00:03:14.980 18:06:58 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:14.980 --rc lcov_branch_coverage=1 00:03:14.980 --rc lcov_function_coverage=1 00:03:14.980 --rc genhtml_branch_coverage=1 00:03:14.980 --rc genhtml_function_coverage=1 00:03:14.980 --rc genhtml_legend=1 00:03:14.980 --rc geninfo_all_blocks=1 00:03:14.980 --no-external' 00:03:14.980 18:06:58 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:03:14.980 --rc lcov_branch_coverage=1 00:03:14.980 --rc lcov_function_coverage=1 00:03:14.980 --rc genhtml_branch_coverage=1 00:03:14.980 --rc genhtml_function_coverage=1 00:03:14.980 --rc 
genhtml_legend=1 00:03:14.980 --rc geninfo_all_blocks=1 00:03:14.980 --no-external' 00:03:14.980 18:06:58 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:14.980 lcov: LCOV version 1.14 00:03:14.980 18:06:58 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:29.843 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:29.843 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:44.779 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:44.779 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:44.779 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 
00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:44.779 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:44.779 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:44.780 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:44.780 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV 
did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:44.780 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:44.780 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:44.780 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:50.044 18:07:33 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:50.044 18:07:33 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:50.044 18:07:33 -- common/autotest_common.sh@10 -- # set +x 00:03:50.044 18:07:33 -- spdk/autotest.sh@91 -- # rm -f 00:03:50.044 18:07:33 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:52.573 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:03:52.573 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:03:52.573 0000:5e:00.0 (8086 0b60): 
Already using the nvme driver 00:03:52.573 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:52.573 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:52.573 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:52.573 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:52.573 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:52.573 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:52.573 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:52.573 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:52.832 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:52.832 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:52.832 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:52.832 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:52.832 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:52.832 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:52.832 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:52.832 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:52.832 18:07:36 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:52.832 18:07:36 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:52.832 18:07:36 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:52.832 18:07:36 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:52.832 18:07:36 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:52.832 18:07:36 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:52.832 18:07:36 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:52.832 18:07:36 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:52.832 18:07:36 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:52.832 18:07:36 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:52.832 
18:07:36 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:52.832 18:07:36 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:52.832 18:07:36 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:52.832 18:07:36 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:52.832 18:07:36 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:52.832 No valid GPT data, bailing 00:03:53.091 18:07:36 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:53.091 18:07:36 -- scripts/common.sh@391 -- # pt= 00:03:53.091 18:07:36 -- scripts/common.sh@392 -- # return 1 00:03:53.091 18:07:36 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:53.091 1+0 records in 00:03:53.091 1+0 records out 00:03:53.091 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00480851 s, 218 MB/s 00:03:53.091 18:07:36 -- spdk/autotest.sh@118 -- # sync 00:03:53.091 18:07:36 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:53.091 18:07:36 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:53.091 18:07:36 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:58.358 18:07:41 -- spdk/autotest.sh@124 -- # uname -s 00:03:58.358 18:07:41 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:58.358 18:07:41 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:58.358 18:07:41 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:58.358 18:07:41 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:58.358 18:07:41 -- common/autotest_common.sh@10 -- # set +x 00:03:58.358 ************************************ 00:03:58.358 START TEST setup.sh 00:03:58.358 ************************************ 00:03:58.358 18:07:41 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 
00:03:58.358 * Looking for test storage... 00:03:58.358 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:58.358 18:07:41 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:58.358 18:07:41 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:58.358 18:07:41 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:58.358 18:07:41 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:03:58.358 18:07:41 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:03:58.358 18:07:41 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:58.358 ************************************ 00:03:58.358 START TEST acl 00:03:58.358 ************************************ 00:03:58.358 18:07:41 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:58.358 * Looking for test storage... 00:03:58.358 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:58.358 18:07:41 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:58.358 18:07:41 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:58.358 18:07:41 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:58.358 18:07:41 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:58.358 18:07:41 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:58.358 18:07:41 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:58.358 18:07:41 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:58.358 18:07:41 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:58.358 18:07:41 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:58.358 18:07:41 setup.sh.acl -- setup/acl.sh@12 -- # 
devs=() 00:03:58.358 18:07:41 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:58.358 18:07:41 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:58.358 18:07:41 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:58.358 18:07:41 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:58.358 18:07:41 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:58.358 18:07:41 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:02.542 18:07:45 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:02.542 18:07:45 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:02.542 18:07:45 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:02.542 18:07:45 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:02.542 18:07:45 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:02.542 18:07:45 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 Hugepages 00:04:05.072 node hugesize free / total 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 
00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 00:04:05.072 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 18:07:48 setup.sh.acl -- 
setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.072 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.073 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.073 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:05.073 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.073 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.073 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.073 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:05.073 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.073 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.073 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- 
# continue 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.331 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.332 18:07:48 
setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:05.332 18:07:48 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:05.332 18:07:48 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:05.332 18:07:48 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:05.332 18:07:48 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:05.332 ************************************ 00:04:05.332 START TEST denied 00:04:05.332 ************************************ 00:04:05.332 18:07:48 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:04:05.332 18:07:48 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:04:05.332 18:07:48 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:04:05.332 18:07:48 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:05.332 18:07:48 
setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:05.332 18:07:48 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:09.520 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:09.520 18:07:52 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:14.787 00:04:14.787 real 0m9.020s 00:04:14.787 user 0m2.979s 00:04:14.787 sys 0m5.354s 00:04:14.787 18:07:57 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:14.787 18:07:57 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:14.787 ************************************ 00:04:14.787 END TEST denied 00:04:14.787 ************************************ 00:04:14.787 18:07:57 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:14.787 18:07:57 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:14.787 18:07:57 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:14.787 18:07:57 setup.sh.acl -- common/autotest_common.sh@1105 -- 
# xtrace_disable 00:04:14.787 18:07:57 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:14.787 ************************************ 00:04:14.787 START TEST allowed 00:04:14.787 ************************************ 00:04:14.787 18:07:58 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:04:14.787 18:07:58 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:14.787 18:07:58 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:14.787 18:07:58 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:14.787 18:07:58 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:14.787 18:07:58 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:21.380 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:21.380 18:08:04 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:21.380 18:08:04 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:21.380 18:08:04 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:21.380 18:08:04 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:21.380 18:08:04 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:24.665 00:04:24.665 real 0m10.179s 00:04:24.665 user 0m2.669s 00:04:24.665 sys 0m4.982s 00:04:24.665 18:08:08 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:24.665 18:08:08 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:24.665 ************************************ 00:04:24.665 END TEST allowed 00:04:24.665 ************************************ 00:04:24.665 18:08:08 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:24.665 00:04:24.665 real 0m26.559s 00:04:24.665 user 0m8.180s 00:04:24.665 sys 0m15.330s 00:04:24.665 18:08:08 setup.sh.acl -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:04:24.665 18:08:08 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:24.665 ************************************ 00:04:24.665 END TEST acl 00:04:24.665 ************************************ 00:04:24.665 18:08:08 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:24.665 18:08:08 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:24.665 18:08:08 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:24.665 18:08:08 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:24.665 18:08:08 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:24.665 ************************************ 00:04:24.665 START TEST hugepages 00:04:24.665 ************************************ 00:04:24.665 18:08:08 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:24.925 * Looking for test storage... 
00:04:24.925 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77057928 kB' 'MemAvailable: 80329812 kB' 'Buffers: 11136 kB' 'Cached: 9251316 kB' 'SwapCached: 0 kB' 'Active: 6292064 kB' 'Inactive: 3441940 kB' 'Active(anon): 5901448 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 475312 kB' 'Mapped: 186904 kB' 'Shmem: 5429896 kB' 'KReclaimable: 185940 kB' 'Slab: 482692 kB' 'SReclaimable: 185940 kB' 'SUnreclaim: 296752 kB' 'KernelStack: 16384 kB' 'PageTables: 8444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438188 kB' 'Committed_AS: 7280140 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 
18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.925 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ 
SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 
18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce 
== \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.926 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 
setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@18 -- # 
global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 
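The long run of `IFS=': ' / read -r var val _ / continue` entries above is setup/common.sh's `get_meminfo` walking /proc/meminfo key by key until it reaches the requested field (here `Hugepagesize`, where it echoes 2048 and returns). A minimal standalone sketch of that parsing pattern follows; the helper name `get_meminfo_sketch` and the fixture file are ours for illustration, not SPDK's, and the sample values are copied from this log's meminfo dump:

```shell
# Sketch of the get_meminfo loop seen in the xtrace above: scan
# "Key: value unit" lines, skipping every key that is not the one
# requested (the long run of "continue" entries in the log), and
# echo the value of the first match.
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Feed it a fixture instead of the live /proc/meminfo so the sketch
# is self-contained (values taken from the log's printf dump).
fixture=$(mktemp)
cat > "$fixture" <<'EOF'
MemTotal:       92293472 kB
HugePages_Total:    1024
Hugepagesize:       2048 kB
EOF
get_meminfo_sketch Hugepagesize "$fixture"   # prints 2048
rm -f "$fixture"
```

With `IFS=': '` the colon acts as a field separator and the surrounding whitespace is absorbed, so `var` gets the bare key and `val` the numeric value, exactly as the `[[ $var == ... ]]` tests in the log assume.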
00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:24.927 18:08:08 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:24.927 18:08:08 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:24.927 18:08:08 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:24.927 18:08:08 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:24.927 ************************************ 00:04:24.927 START TEST default_setup 00:04:24.927 ************************************ 00:04:24.927 18:08:08 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:04:24.927 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:24.927 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:24.927 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:24.927 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:24.927 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:24.927 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:24.927 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:24.927 18:08:08 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:24.927 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:24.928 18:08:08 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:28.223 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:28.223 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:28.223 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:28.223 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:28.223 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:28.223 0000:00:04.4 (8086 2021): ioatdma 
-> vfio-pci 00:04:28.223 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:28.223 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:28.223 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:28.223 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:28.223 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:28.223 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:28.223 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:28.482 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:28.482 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:28.482 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:28.482 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:28.482 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:31.015 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local 
var val 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79169616 kB' 'MemAvailable: 82441412 kB' 'Buffers: 11136 kB' 'Cached: 9251428 kB' 'SwapCached: 0 kB' 'Active: 6308948 kB' 'Inactive: 3441940 kB' 'Active(anon): 5918332 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490804 kB' 'Mapped: 186976 kB' 'Shmem: 5430008 kB' 'KReclaimable: 185764 kB' 'Slab: 483000 kB' 'SReclaimable: 185764 kB' 'SUnreclaim: 297236 kB' 'KernelStack: 16416 kB' 'PageTables: 8216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7302940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.015 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:04:31.016 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ [... per-field skip trace trimmed: SecPageTables through HardwareCorrupted, each iteration repeating `[[ field == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]` / `continue` / `IFS=': '` / `read -r var val _` ...] 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- #
mapfile -t mem 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79170568 kB' 'MemAvailable: 82442364 kB' 'Buffers: 11136 kB' 'Cached: 9251432 kB' 'SwapCached: 0 kB' 'Active: 6307624 kB' 'Inactive: 3441940 kB' 'Active(anon): 5917008 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490400 kB' 'Mapped: 186892 kB' 'Shmem: 5430012 kB' 'KReclaimable: 185764 kB' 'Slab: 482988 kB' 'SReclaimable: 185764 kB' 'SUnreclaim: 297224 kB' 'KernelStack: 16352 kB' 'PageTables: 8704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7302960 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200964 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:04:31.017 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ [... per-field skip trace trimmed: MemFree through HugePages_Rsvd, each iteration repeating `[[ field == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]` / `continue` / `IFS=': '` / `read -r var val _` ...] 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79172900 kB' 'MemAvailable: 82444688 kB' 'Buffers: 11136 kB' 'Cached: 9251448 kB' 'SwapCached: 0 kB' 'Active: 6307764 kB' 'Inactive: 3441940 kB' 'Active(anon): 5917148 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490448 kB' 'Mapped: 186892 kB' 'Shmem: 5430028 kB' 'KReclaimable: 185748 kB' 'Slab: 482972 kB' 'SReclaimable: 185748 kB' 'SUnreclaim: 297224 kB' 'KernelStack: 16400 kB' 'PageTables: 8296 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7301512 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200980 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:31.019 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue [... per-field skip trace trimmed: MemFree through Slab, each iteration repeating `[[ field == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]` / `continue` / `IFS=': '` / `read -r var val _` ...]
00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 
18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 
18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.021 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.021 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:31.022 nr_hugepages=1024 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:31.022 resv_hugepages=0 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:31.022 surplus_hugepages=0 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:31.022 anon_hugepages=0 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79171896 kB' 'MemAvailable: 82443684 kB' 'Buffers: 11136 kB' 'Cached: 9251468 kB' 'SwapCached: 0 kB' 'Active: 6308084 kB' 'Inactive: 3441940 kB' 'Active(anon): 5917468 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490720 kB' 'Mapped: 186892 kB' 'Shmem: 5430048 kB' 'KReclaimable: 185748 kB' 'Slab: 482972 kB' 'SReclaimable: 185748 kB' 'SUnreclaim: 297224 kB' 'KernelStack: 16288 kB' 'PageTables: 8184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7301520 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200948 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.022 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.023 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( 
nodes_test[node] += resv )) 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37096196 kB' 'MemUsed: 11020744 kB' 'SwapCached: 0 kB' 'Active: 4829672 kB' 'Inactive: 3371680 kB' 'Active(anon): 4671776 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7978652 kB' 'Mapped: 106200 kB' 'AnonPages: 225892 kB' 'Shmem: 4449076 kB' 'KernelStack: 9160 kB' 'PageTables: 4220 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104008 kB' 'Slab: 287452 kB' 'SReclaimable: 104008 kB' 'SUnreclaim: 
183444 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.024 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.283 
18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.283 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:31.284 18:08:14 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:31.284 node0=1024 expecting 1024 00:04:31.284 18:08:14 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:31.284 00:04:31.285 real 0m6.230s 00:04:31.285 user 0m1.351s 00:04:31.285 sys 0m2.420s 00:04:31.285 18:08:14 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:31.285 18:08:14 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:31.285 ************************************ 00:04:31.285 END TEST default_setup 00:04:31.285 ************************************ 00:04:31.285 18:08:14 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:31.285 18:08:14 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:31.285 18:08:14 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:31.285 18:08:14 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:31.285 18:08:14 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:31.285 ************************************ 00:04:31.285 START TEST per_node_1G_alloc 00:04:31.285 ************************************ 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:31.285 18:08:14 
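The long runs of `[[ $var == ... ]] || continue` trace lines above all come from one small idiom in setup/common.sh: scan meminfo-style `Key: value` lines and print the value of a single requested key. A minimal standalone sketch of that idiom (the sample input lines below are illustrative, not taken from this run):

```shell
#!/bin/sh
# get_meminfo-style scan: split each "Key: value unit" line on ': ' and
# skip every line whose key does not match, exactly as the repeated
# "continue" trace lines do; print the value of the first match.
get_meminfo() {
    get=$1
    while IFS=': ' read -r var val _; do
        [ "$var" = "$get" ] || continue
        echo "$val"
        return 0
    done
}

printf '%s\n' 'MemTotal: 48116940 kB' 'HugePages_Total: 1024' 'HugePages_Surp: 0' \
    | get_meminfo HugePages_Total    # prints 1024
```

In the real script the input is `/proc/meminfo` (or `/sys/devices/system/node/node$N/meminfo` for a per-node query), which is why every key in that file shows up once per lookup in the xtrace output.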
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:31.285 18:08:14 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.285 18:08:14 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:34.567 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:34.567 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:34.567 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:34.567 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:34.567 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:34.567 18:08:18 
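The `get_test_nr_hugepages 1048576 0 1` call traced above turns a size request into a per-node page count: a 1048576 kB (1 GiB) request divided by the 2048 kB default hugepage size gives `nr_hugepages=512`, assigned to each node named in `HUGENODE=0,1`. A hedged re-derivation of those numbers (the variable names are illustrative, not the script's own):

```shell
#!/bin/sh
# Re-derive the trace's numbers: 1 GiB of hugepage memory at the 2 MiB
# default page size, spread across NUMA nodes 0 and 1.
size_kb=1048576          # requested size in kB (1 GiB)
default_hugepage_kb=2048 # default hugepage size on this system (2 MiB)
nr_hugepages=$(( size_kb / default_hugepage_kb ))
echo "NRHUGE=$nr_hugepages"     # 512, matching the trace
for node in 0 1; do             # the HUGENODE=0,1 list
    echo "node${node}=$nr_hugepages"
done
```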
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79163988 kB' 'MemAvailable: 82435776 kB' 'Buffers: 11136 kB' 'Cached: 9251564 kB' 'SwapCached: 0 kB' 'Active: 6307780 kB' 'Inactive: 3441940 kB' 'Active(anon): 5917164 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489680 kB' 'Mapped: 187040 kB' 'Shmem: 5430144 kB' 'KReclaimable: 185748 kB' 'Slab: 482044 kB' 'SReclaimable: 185748 kB' 'SUnreclaim: 296296 kB' 'KernelStack: 16176 kB' 'PageTables: 7984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7300736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201044 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB'
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:34.567 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... the @32 field test / @32 continue / @31 IFS / @31 read cycle repeats identically for each remaining /proc/meminfo field, MemFree through HardwareCorrupted ...]
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
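The get_meminfo calls traced here all follow the same pattern: slurp the meminfo file with mapfile, strip any "Node N " prefix (the per-node sysfs copies carry one), then scan "key: value" pairs with `IFS=': ' read` until the requested field matches. A minimal standalone sketch, reconstructed from the xtrace — the explicit file argument is an assumption for illustration; the real setup/common.sh helper selects /proc/meminfo or /sys/devices/system/node/node<N>/meminfo itself:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo helper whose execution the xtrace shows.
# The second (file) argument is a hypothetical addition for testability;
# the traced helper derives mem_f from an optional node number instead.
shopt -s extglob   # required for the +([0-9]) pattern below

get_meminfo() {
    local get=$1
    local mem_f=${2:-/proc/meminfo}
    local -a mem
    mapfile -t mem < "$mem_f"
    # Per-node sysfs meminfo lines start with e.g. "Node 0 "; strip it,
    # exactly as the traced common.sh@29 expansion does.
    mem=("${mem[@]#Node +([0-9]) }")
    local var val _
    while IFS=': ' read -r var val _; do
        # On a match, emit the numeric value; the trailing "kB" lands in $_.
        [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
    done < <(printf '%s\n' "${mem[@]}")
    echo 0   # field absent: assumed default of 0
}
```

On the box in this log, `get_meminfo HugePages_Surp` walks every field before HugePages_Surp and prints 0, which is exactly the long [[ ... ]] / continue cascade recorded above.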
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79164660 kB' 'MemAvailable: 82436448 kB' 'Buffers: 11136 kB' 'Cached: 9251568 kB' 'SwapCached: 0 kB' 'Active: 6307176 kB' 'Inactive: 3441940 kB' 'Active(anon): 5916560 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489684 kB' 'Mapped: 186920 kB' 'Shmem: 5430148 kB' 'KReclaimable: 185748 kB' 'Slab: 482072 kB' 'SReclaimable: 185748 kB' 'SUnreclaim: 296324 kB' 'KernelStack: 16224 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7300756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201044 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB'
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.568 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... the @32 field test / @32 continue / @31 IFS / @31 read cycle repeats identically for each subsequent /proc/meminfo field, MemFree through ShmemHugePages ...]
00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:34.836 18:08:18
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.836 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79165652 kB' 'MemAvailable: 82437440 kB' 'Buffers: 11136 kB' 'Cached: 9251568 kB' 'SwapCached: 0 kB' 'Active: 6307220 kB' 'Inactive: 3441940 kB' 'Active(anon): 5916604 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 
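[editor's note] The xtrace above shows `get_meminfo` reading `/proc/meminfo` with an `IFS=': '` `read` loop and scanning for a single key. A minimal standalone sketch of that parsing pattern (function name and the stdin-fed sample are illustrative, not part of setup/common.sh):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern: split each "Key: value [unit]"
# line on IFS=': ', print the value for the requested key, else 0.
get_meminfo_sketch() {
  local get=$1 var val _
  while IFS=': ' read -r var val _; do
    # With ':' and ' ' both in IFS, "MemTotal:" splits to var=MemTotal.
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done
  echo 0
}

# Sample lines taken from the meminfo dump in this log.
sample='MemTotal: 92293472 kB
HugePages_Total: 1024
HugePages_Surp: 0'

echo "HugePages_Total=$(get_meminfo_sketch HugePages_Total <<<"$sample")"
```

In the real script the loop also strips a leading `Node <n> ` prefix (`mem=("${mem[@]#Node +([0-9]) }")`) so the same parser works for per-node `/sys/devices/system/node/node<n>/meminfo` files.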
489720 kB' 'Mapped: 186920 kB' 'Shmem: 5430148 kB' 'KReclaimable: 185748 kB' 'Slab: 482072 kB' 'SReclaimable: 185748 kB' 'SUnreclaim: 296324 kB' 'KernelStack: 16240 kB' 'PageTables: 8176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7300780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201060 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:34.836 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ [... identical @31/@32 xtrace repeats for each remaining /proc/meminfo field (Buffers through HugePages_Total), none matching HugePages_Rsvd ...] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[
HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:34.838 nr_hugepages=1024 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:34.838 resv_hugepages=0 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:34.838 surplus_hugepages=0 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:34.838 anon_hugepages=0 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- 
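[editor's note] With `surp=0` and `resv=0` read back, the trace then evaluates `(( 1024 == nr_hugepages + surp + resv ))` and `(( 1024 == nr_hugepages ))`. A hedged sketch of that accounting check (the function name is illustrative; hugepages.sh inlines these arithmetic tests rather than wrapping them):

```shell
#!/usr/bin/env bash
# Sketch of the hugepages.sh consistency check around @107/@109:
# the total hugepage count must equal the requested count plus
# surplus and reserved pages, and (as both checks pass here with
# surp=resv=0) must also equal the requested count itself.
hugepages_consistent() {
  local nr_hugepages=$1 total=$2 surp=$3 resv=$4
  (( total == nr_hugepages + surp + resv )) && (( total == nr_hugepages ))
}

# Values read back in the log above: total=1024, surp=0, resv=0.
if hugepages_consistent 1024 1024 0 0; then
  echo consistent
fi
```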
setup/common.sh@19 -- # local var val 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79165404 kB' 'MemAvailable: 82437192 kB' 'Buffers: 11136 kB' 'Cached: 9251608 kB' 'SwapCached: 0 kB' 'Active: 6307200 kB' 'Inactive: 3441940 kB' 'Active(anon): 5916584 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489676 kB' 'Mapped: 186920 kB' 'Shmem: 5430188 kB' 'KReclaimable: 185748 kB' 'Slab: 482072 kB' 'SReclaimable: 185748 kB' 'SUnreclaim: 296324 kB' 'KernelStack: 16224 kB' 'PageTables: 8124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7300800 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201060 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 
'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 
18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.838 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 
18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.839 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:34.840 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 38142464 kB' 'MemUsed: 9974476 kB' 'SwapCached: 0 kB' 'Active: 4831344 kB' 'Inactive: 3371680 kB' 'Active(anon): 4673448 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7978796 kB' 'Mapped: 106228 kB' 'AnonPages: 227484 kB' 'Shmem: 4449220 kB' 'KernelStack: 9176 kB' 'PageTables: 4344 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104008 kB' 'Slab: 286952 kB' 'SReclaimable: 104008 kB' 'SUnreclaim: 182944 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.840 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41031624 kB' 'MemUsed: 3144908 kB' 'SwapCached: 0 kB' 'Active: 1474968 kB' 'Inactive: 70260 kB' 'Active(anon): 1242248 kB' 'Inactive(anon): 0 kB' 'Active(file): 232720 kB' 'Inactive(file): 70260 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1283976 kB' 'Mapped: 80188 kB' 'AnonPages: 261268 kB' 'Shmem: 980996 kB' 'KernelStack: 7048 kB' 'PageTables: 3772 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 81740 kB' 'Slab: 195112 kB' 'SReclaimable: 81740 kB' 'SUnreclaim: 113372 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:34.841 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 
18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
[[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.842 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.843 18:08:18 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:34.843 node0=512 expecting 512 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- 
# for node in "${!nodes_test[@]}" 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:34.843 node1=512 expecting 512 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:34.843 00:04:34.843 real 0m3.592s 00:04:34.843 user 0m1.337s 00:04:34.843 sys 0m2.328s 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:34.843 18:08:18 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:34.843 ************************************ 00:04:34.843 END TEST per_node_1G_alloc 00:04:34.843 ************************************ 00:04:34.843 18:08:18 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:34.843 18:08:18 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:34.843 18:08:18 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:34.843 18:08:18 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:34.843 18:08:18 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:34.843 ************************************ 00:04:34.843 START TEST even_2G_alloc 00:04:34.843 ************************************ 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:34.843 
18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:34.843 18:08:18 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:38.136 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:38.136 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:38.136 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:38.136 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:38.136 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 
00:04:38.136 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79180112 kB' 'MemAvailable: 82451896 kB' 'Buffers: 11136 kB' 'Cached: 9251716 kB' 'SwapCached: 0 kB' 'Active: 6305696 kB' 'Inactive: 3441940 kB' 'Active(anon): 5915080 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 488100 kB' 'Mapped: 185940 kB' 'Shmem: 5430296 kB' 'KReclaimable: 185740 kB' 'Slab: 481744 kB' 'SReclaimable: 185740 kB' 'SUnreclaim: 296004 kB' 'KernelStack: 16128 kB' 'PageTables: 7652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7290524 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201044 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.136 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
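The long run of `IFS=': ' / read -r var val _ / continue` entries in this trace is `get_meminfo` scanning every `/proc/meminfo` field until the key it wants (here `AnonHugePages`) matches at `common.sh@32`; every non-matching key hits the `continue` branch, which is why each field name appears once. A simplified reimplementation of that pattern (assumptions: plain `/proc/meminfo` only; the trace additionally strips a leading `Node <n> ` prefix with `${mem[@]#Node +([0-9]) }` when reading a node-local meminfo, which this sketch omits):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo scan: print the numeric value for one key
# from /proc/meminfo, or 0 if the key is absent or the file is unreadable.
get_meminfo() {
  local get=$1 var val _
  local mem_f=/proc/meminfo

  # Split each "Key:   value kB" line on ':' and spaces; the third field
  # (the "kB" unit, when present) is discarded into _.
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] || continue   # the @32 mismatch/continue branch in the trace
    echo "${val:-0}"
    return 0
  done < "$mem_f" 2>/dev/null

  echo 0
}

get_meminfo MemTotal   # prints the machine's total memory in kB (or 0)
```

Note the trace's version iterates over a `mapfile`-captured array rather than reading the file directly, so the same loop can serve both the system-wide and per-node meminfo files; the matching logic is the same.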
00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s 
]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.137 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79180316 kB' 'MemAvailable: 82452100 kB' 'Buffers: 11136 kB' 'Cached: 9251720 kB' 'SwapCached: 0 kB' 'Active: 6305136 kB' 'Inactive: 3441940 kB' 'Active(anon): 5914520 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 
'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 487604 kB' 'Mapped: 185880 kB' 'Shmem: 5430300 kB' 'KReclaimable: 185740 kB' 'Slab: 481776 kB' 'SReclaimable: 185740 kB' 'SUnreclaim: 296036 kB' 'KernelStack: 16112 kB' 'PageTables: 7616 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7290544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.138 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.139 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79179560 kB' 'MemAvailable: 82451344 kB' 'Buffers: 11136 
kB' 'Cached: 9251736 kB' 'SwapCached: 0 kB' 'Active: 6305208 kB' 'Inactive: 3441940 kB' 'Active(anon): 5914592 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 487604 kB' 'Mapped: 185880 kB' 'Shmem: 5430316 kB' 'KReclaimable: 185740 kB' 'Slab: 481776 kB' 'SReclaimable: 185740 kB' 'SUnreclaim: 296036 kB' 'KernelStack: 16112 kB' 'PageTables: 7616 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7290564 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.405 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 
18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.406 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.407 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
[[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:38.407 nr_hugepages=1024 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:38.407 resv_hugepages=0 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:38.407 surplus_hugepages=0 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:38.407 anon_hugepages=0 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:38.407 18:08:21 
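The trace above shows `setup/common.sh`'s `get_meminfo` walking every `/proc/meminfo` key until it matches the requested one (here `HugePages_Rsvd`, yielding `0` for `resv`). A minimal stand-alone sketch of that lookup — the function name and the demo snippet are ours, not SPDK's; the real script slurps the file with `mapfile` and strips any per-node prefix in one pass, which this line-at-a-time loop mirrors for a single key:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo lookup traced above: strip any per-node
# "Node N " prefix, split each line on ': ', and print the value for
# the requested key. Units ("kB") fall into the discarded _ field.
shopt -s extglob

get_meminfo_sketch() {
  local get=$1 file=$2 line var val _
  while IFS= read -r line; do
    line=${line#Node +([0-9]) }        # per-node sysfs files carry this prefix
    IFS=': ' read -r var val _ <<<"$line"
    if [[ $var == "$get" ]]; then
      echo "$val"
      return 0
    fi
  done <"$file"
  return 1                             # key not present
}

# Demo against a small fabricated meminfo snippet:
tmp=$(mktemp)
printf '%s\n' 'MemTotal: 92293472 kB' 'HugePages_Total:    1024' \
  'HugePages_Rsvd:        0' >"$tmp"
get_meminfo_sketch HugePages_Rsvd "$tmp"   # prints 0
rm -f "$tmp"
```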
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79178804 kB' 'MemAvailable: 82450588 kB' 'Buffers: 11136 kB' 'Cached: 9251736 kB' 'SwapCached: 0 kB' 'Active: 6305208 kB' 'Inactive: 3441940 kB' 'Active(anon): 5914592 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 487604 kB' 'Mapped: 185880 kB' 'Shmem: 5430316 kB' 'KReclaimable: 185740 kB' 'Slab: 481776 kB' 'SReclaimable: 185740 kB' 'SUnreclaim: 296036 kB' 'KernelStack: 16112 kB' 'PageTables: 7616 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7290720 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201012 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.407 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 
00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:38.408 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.409 18:08:21 
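Once the `HugePages_Total` lookup returns `1024`, the trace shows `get_nodes` counting two NUMA nodes and the `even_2G_alloc` test expecting 512 pages on each. The bookkeeping can be sketched as follows — variable names follow the script's, but the node count is taken from this trace rather than probed from sysfs:

```shell
#!/usr/bin/env bash
# Sketch of the even-split check traced above: divide nr_hugepages
# evenly across the detected NUMA nodes, then verify the per-node
# totals add back up to the system-wide count.
nr_hugepages=1024
no_nodes=2                               # trace shows node0 and node1
declare -a nodes_test

for ((node = 0; node < no_nodes; node++)); do
  nodes_test[node]=$((nr_hugepages / no_nodes))
done

sum=0
for node in "${!nodes_test[@]}"; do
  sum=$((sum + nodes_test[node]))
done
echo "per-node=${nodes_test[0]} sum=$sum"   # per-node=512 sum=1024
```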
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 38140616 kB' 'MemUsed: 9976324 kB' 'SwapCached: 0 kB' 'Active: 4829212 kB' 'Inactive: 3371680 kB' 'Active(anon): 4671316 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7978912 kB' 'Mapped: 106000 kB' 'AnonPages: 225224 kB' 'Shmem: 4449336 kB' 'KernelStack: 9048 kB' 'PageTables: 3824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104000 kB' 'Slab: 286580 kB' 'SReclaimable: 104000 kB' 'SUnreclaim: 182580 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 
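For the node-0 `HugePages_Surp` lookup, the trace checks `/sys/devices/system/node/node0/meminfo` and switches `mem_f` to it. A hedged sketch of that source selection — the helper name is ours, while the sysfs path is the standard per-node meminfo location used in the trace:

```shell
#!/usr/bin/env bash
# Sketch of the mem_f selection traced above: read the per-node
# meminfo file when a node is given and its sysfs entry exists,
# otherwise fall back to the system-wide /proc/meminfo.
pick_meminfo_source() {
  local node=$1 mem_f=/proc/meminfo
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  echo "$mem_f"
}

pick_meminfo_source ""    # prints /proc/meminfo
```

Per-node files prefix every line with `Node N `, which is why the scan first strips that prefix before splitting on `': '`.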
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.409 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 
18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # 
mem_f=/sys/devices/system/node/node1/meminfo 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41037684 kB' 'MemUsed: 3138848 kB' 'SwapCached: 0 kB' 'Active: 1476428 kB' 'Inactive: 70260 kB' 'Active(anon): 1243708 kB' 'Inactive(anon): 0 kB' 'Active(file): 232720 kB' 'Inactive(file): 70260 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1284008 kB' 'Mapped: 79880 kB' 'AnonPages: 262820 kB' 'Shmem: 981028 kB' 'KernelStack: 7080 kB' 'PageTables: 3860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 81740 kB' 'Slab: 195196 kB' 'SReclaimable: 81740 kB' 'SUnreclaim: 113456 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.410 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:38.411 node0=512 expecting 512 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:38.411 node1=512 expecting 512 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:38.411 00:04:38.411 real 0m3.468s 00:04:38.411 user 0m1.259s 00:04:38.411 sys 0m2.212s 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:04:38.411 18:08:21 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:38.411 ************************************ 00:04:38.411 END TEST even_2G_alloc 00:04:38.411 ************************************ 00:04:38.411 18:08:22 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:38.411 18:08:22 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:38.412 18:08:22 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:38.412 18:08:22 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:38.412 18:08:22 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:38.412 ************************************ 00:04:38.412 START TEST odd_alloc 00:04:38.412 ************************************ 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # 
local _no_nodes=2 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:38.412 18:08:22 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:41.708 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:41.708 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:41.969 
0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:41.969 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:41.969 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != 
*\[\n\e\v\e\r\]* ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79182336 kB' 'MemAvailable: 82454116 kB' 'Buffers: 11136 kB' 'Cached: 9251872 kB' 'SwapCached: 0 kB' 'Active: 6306608 kB' 'Inactive: 3441940 kB' 'Active(anon): 5915992 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 488260 kB' 'Mapped: 185960 kB' 'Shmem: 5430452 kB' 'KReclaimable: 185732 kB' 'Slab: 482080 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296348 kB' 'KernelStack: 16160 kB' 'PageTables: 7756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7291432 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201060 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:41.969 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79182368 kB' 'MemAvailable: 82454148 kB' 'Buffers: 11136 kB' 'Cached: 9251876 kB' 'SwapCached: 0 kB' 'Active: 6306332 kB' 'Inactive: 3441940 kB' 'Active(anon): 5915716 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 488504 kB' 'Mapped: 185892 kB' 'Shmem: 5430456 kB' 'KReclaimable: 185732 kB' 'Slab: 482080 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296348 kB' 'KernelStack: 16160 kB' 'PageTables: 7764 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7291448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.970 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.970 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.971 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:41.972 
18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79182620 kB' 'MemAvailable: 82454400 kB' 'Buffers: 11136 kB' 'Cached: 9251892 kB' 'SwapCached: 0 kB' 'Active: 6306644 kB' 'Inactive: 3441940 kB' 'Active(anon): 5916028 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 488812 kB' 'Mapped: 185892 
kB' 'Shmem: 5430472 kB' 'KReclaimable: 185732 kB' 'Slab: 482136 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296404 kB' 'KernelStack: 16176 kB' 'PageTables: 7840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7291468 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 
18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.972 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.972 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:41.973 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:42.235 nr_hugepages=1025 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:42.235 resv_hugepages=0 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:42.235 surplus_hugepages=0 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:42.235 anon_hugepages=0 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # 
get_meminfo HugePages_Total 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:42.235 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79189856 kB' 'MemAvailable: 82461636 kB' 'Buffers: 11136 kB' 'Cached: 9251912 kB' 'SwapCached: 0 kB' 'Active: 6307240 kB' 'Inactive: 3441940 kB' 'Active(anon): 5916624 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489368 kB' 'Mapped: 186396 kB' 'Shmem: 5430492 kB' 'KReclaimable: 185732 kB' 'Slab: 482136 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296404 kB' 'KernelStack: 16144 kB' 'PageTables: 7740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7293504 kB' 'VmallocTotal: 34359738367 kB' 
'VmallocUsed: 201044 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 
18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.236 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 
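The trace above shows `get_meminfo` capturing `/proc/meminfo` with `mapfile`, then scanning it line by line with `IFS=': ' read -r var val _`, hitting `continue` on every field until `HugePages_Total` matches and `echo 1025` returns the value. A minimal standalone sketch of that same pattern follows; `meminfo_get` is a hypothetical name (not the actual `setup/common.sh` function), and it reads a captured snapshot file so the sketch stays self-contained:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo-style lookup seen in the trace: scan
# "Key: value [kB]" lines and echo the value of the requested key.

meminfo_get() {
    local get=$1 src=$2 var val _
    while IFS=': ' read -r var val _; do
        # Skip every field until the requested one, mirroring the
        # trace's repeated "[[ Field == ... ]] / continue" steps.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$src"
    return 1
}

# Use a small captured snapshot instead of the live /proc/meminfo
# so the sketch runs anywhere.
snapshot=$(mktemp)
printf '%s\n' 'MemTotal: 92293472 kB' 'HugePages_Total: 1025' \
    'HugePages_Free: 1025' 'Hugepagesize: 2048 kB' > "$snapshot"

meminfo_get HugePages_Total "$snapshot"   # prints 1025
rm -f "$snapshot"
```

On a real system the same call would read `/proc/meminfo` directly (or a per-node `/sys/devices/system/node/nodeN/meminfo`, as the trace does later for node 0).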
00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 
-- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 38151292 kB' 'MemUsed: 9965648 kB' 'SwapCached: 0 kB' 'Active: 4836600 kB' 'Inactive: 3371680 kB' 'Active(anon): 4678704 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7979064 kB' 'Mapped: 106712 kB' 'AnonPages: 232420 kB' 'Shmem: 4449488 kB' 'KernelStack: 9112 kB' 'PageTables: 4028 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104000 kB' 'Slab: 286852 kB' 'SReclaimable: 104000 kB' 'SUnreclaim: 182852 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.237 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 
18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 [... identical IFS=': ' / read / continue skip iterations for the non-matching meminfo fields Shmem through FileHugePages elided ...] 00:04:42.238 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.238 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:42.238 18:08:25 
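The xtrace above is the field-scan inside SPDK's `get_meminfo` helper: it reads a meminfo-style file line by line with `IFS=': '`, skips every field that is not the requested key, and echoes the value when it hits the match (here `HugePages_Surp: 0`). A simplified, self-contained sketch of that pattern (not the exact `setup/common.sh` implementation, which also resolves per-node `/sys/devices/system/node/nodeN/meminfo` paths):

```shell
# Sketch of the get_meminfo scan pattern seen in the trace above.
# Reads "Key:   value [unit]" lines, prints the value for the requested key.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # non-matching field: skip (the "continue" lines above)
        echo "$val"                        # matching field: print value (the "echo 0" above)
        return 0
    done < "$mem_f"
    return 1                               # key not present
}
```

Each `continue` in the trace corresponds to one non-matching field, which is why the loop body repeats once per line of the meminfo file before the final `echo`/`return`.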
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41031696 kB' 'MemUsed: 3144836 kB' 'SwapCached: 0 kB' 'Active: 1475232 kB' 'Inactive: 70260 kB' 'Active(anon): 1242512 kB' 'Inactive(anon): 0 kB' 'Active(file): 232720 kB' 'Inactive(file): 70260 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1284008 kB' 'Mapped: 80032 kB' 'AnonPages: 261520 kB' 
'Shmem: 981028 kB' 'KernelStack: 7048 kB' 'PageTables: 3772 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 81732 kB' 'Slab: 195276 kB' 'SReclaimable: 81732 kB' 'SUnreclaim: 113544 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.239 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:42.239 [... identical IFS=': ' / read / continue skip iterations for the non-matching meminfo fields Active through Unaccepted elided ...] 00:04:42.240 18:08:25 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:42.240 node0=512 expecting 513 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:42.240 node1=513 expecting 512 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:42.240 00:04:42.240 real 0m3.767s 00:04:42.240 user 0m1.473s 00:04:42.240 sys 0m2.353s 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:42.240 18:08:25 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:42.240 ************************************ 00:04:42.240 END TEST odd_alloc 00:04:42.240 ************************************ 00:04:42.240 18:08:25 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:42.240 18:08:25 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:42.240 18:08:25 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:42.240 18:08:25 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:42.240 18:08:25 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:42.240 ************************************ 00:04:42.240 START TEST custom_alloc 00:04:42.240 ************************************ 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc 
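The odd_alloc result above (`node0=512 expecting 513`, `node1=513 expecting 512`) and the custom_alloc setup that follows both come from `get_test_nr_hugepages_per_node` splitting a hugepage count across NUMA nodes. Judging from the trace (`nodes_test[_no_nodes - 1]=256` twice for 512 pages on 2 nodes), the split walks nodes from the highest index down, giving each an even share of what remains; this is a hedged sketch reconstructed from that trace, not a verbatim copy of `setup/hugepages.sh`:

```shell
# Sketch of the per-node hugepage split visible in the trace:
# highest-indexed node gets remaining/no_nodes, then repeat on the rest.
split_per_node() {
    local remaining=$1 no_nodes=$2
    local -a nodes_test=()
    while ((no_nodes > 0)); do
        nodes_test[no_nodes - 1]=$((remaining / no_nodes))  # this node's share
        ((remaining -= nodes_test[no_nodes - 1]))           # leave the rest for lower nodes
        ((no_nodes--))
    done
    echo "${nodes_test[@]}"
}
```

For an even total (512 on 2 nodes) this yields 256/256; for an odd total (1025 on 2 nodes) the lower node absorbs the remainder (513/512), which is exactly the asymmetry the odd_alloc test checks for.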
-- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes 
- 1]=256 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:42.240 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:42.241 18:08:25 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:46.467 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:46.467 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:46.467 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:46.467 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:46.467 
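Just before `setup output` runs, the trace builds `HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'` by appending one `nodes_hp[i]=count` entry per node and letting `local IFS=,` join the array on expansion. A minimal sketch of that assembly step, assuming the comma-join via `IFS` seen at `hugepages.sh@167`/`@181-183`:

```shell
# Sketch of the HUGENODE string assembly from the trace above:
# one "nodes_hp[i]=count" entry per node, comma-joined via IFS.
build_hugenode() {
    local IFS=,                      # "${arr[*]}" joins elements with the first IFS char
    local -a nodes_hp=("$@") HUGENODE=()
    local node
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
    done
    echo "${HUGENODE[*]}"
}
```

The resulting string is what `scripts/setup.sh` consumes to request an explicit per-node hugepage layout instead of a uniform one.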
0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:46.467 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # 
mem_f=/proc/meminfo 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78146112 kB' 'MemAvailable: 81417892 kB' 'Buffers: 11136 kB' 'Cached: 9252024 kB' 'SwapCached: 0 kB' 'Active: 6307008 kB' 'Inactive: 3441940 kB' 'Active(anon): 5916392 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489004 kB' 'Mapped: 185952 kB' 'Shmem: 5430604 kB' 'KReclaimable: 185732 kB' 'Slab: 482052 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296320 kB' 'KernelStack: 16128 kB' 'PageTables: 7688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7291964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201060 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 
kB'
00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:46.467 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:46.469
18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78147316 kB' 'MemAvailable: 81419096 kB' 'Buffers: 11136 kB' 'Cached: 9252028 kB' 'SwapCached: 0 kB' 'Active: 6306796 kB' 'Inactive: 3441940 kB' 'Active(anon): 5916180 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 488840 kB' 'Mapped: 185904 kB' 'Shmem: 5430608 kB' 'KReclaimable: 185732 kB' 'Slab: 482108 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296376 kB' 'KernelStack: 16160 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7291984 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB'
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:46.469 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:04:46.470 18:08:29
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.470 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf 
'%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78147956 kB' 'MemAvailable: 81419736 kB' 'Buffers: 11136 kB' 'Cached: 9252044 kB' 'SwapCached: 0 kB' 'Active: 6306812 kB' 'Inactive: 3441940 kB' 'Active(anon): 5916196 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 488844 kB' 'Mapped: 185904 kB' 'Shmem: 5430624 kB' 'KReclaimable: 185732 kB' 'Slab: 482108 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296376 kB' 'KernelStack: 16160 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7292004 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.471 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:46.472 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:46.472 nr_hugepages=1536 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:46.472 resv_hugepages=0 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:46.472 surplus_hugepages=0 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:46.472 anon_hugepages=0 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:46.472 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:46.473 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78148208 kB' 'MemAvailable: 81419988 kB' 'Buffers: 11136 kB' 'Cached: 9252064 kB' 'SwapCached: 0 kB' 'Active: 6307136 kB' 'Inactive: 3441940 kB' 'Active(anon): 5916520 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 489140 kB' 'Mapped: 185904 kB' 'Shmem: 5430644 kB' 'KReclaimable: 185732 kB' 'Slab: 482108 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296376 kB' 'KernelStack: 16160 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7292028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.473 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- 
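[Annotation] The loop iterations above are `get_meminfo` from setup/common.sh scanning every meminfo key until it hits the one requested (`HugePages_Total`, echoed as 1536). A minimal re-creation of that helper, reconstructed from the trace (the upstream setup/common.sh may differ in detail), looks like:

```shell
#!/usr/bin/env bash
shopt -s extglob  # needed for the "Node +([0-9]) " prefix strip seen in the trace

# Sketch of get_meminfo as exercised by the trace: look up one key in
# /proc/meminfo, or in a NUMA node's meminfo when a node id is given.
get_meminfo() {
	local get=$1 node=${2:-}
	local mem_f=/proc/meminfo mem line var val _
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	mapfile -t mem <"$mem_f"
	# Per-node files prefix each line with "Node N "; drop that prefix.
	mem=("${mem[@]#Node +([0-9]) }")
	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<<"$line"
		if [[ $var == "$get" ]]; then
			echo "${val:-0}"
			return 0
		fi
	done
	echo 0
}

get_meminfo HugePages_Total  # system-wide total; value depends on the host
```

Each `[[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] ... continue` pair in the log is simply one non-matching iteration of the `for line` loop with xtrace enabled, which is why the trace is so long.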
setup/hugepages.sh@112 -- # get_nodes 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 
-- # mapfile -t mem 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 38150192 kB' 'MemUsed: 9966748 kB' 'SwapCached: 0 kB' 'Active: 4832676 kB' 'Inactive: 3371680 kB' 'Active(anon): 4674780 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7979216 kB' 'Mapped: 106024 kB' 'AnonPages: 228412 kB' 'Shmem: 4449640 kB' 'KernelStack: 9144 kB' 'PageTables: 4240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104000 kB' 'Slab: 287016 kB' 'SReclaimable: 104000 kB' 'SUnreclaim: 183016 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.474 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.475 
18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:46.475 18:08:29 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [field-by-field scan of node0 meminfo: Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free; none matches HugePages_Surp, each iteration continues]
00:04:46.475 18:08:29 setup.sh.hugepages.custom_alloc --
setup/common.sh@31 -- # read -r var val _
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:46.476 18:08:29
setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 39997964 kB' 'MemUsed: 4178568 kB' 'SwapCached: 0 kB' 'Active: 1475200 kB' 'Inactive: 70260 kB' 'Active(anon): 1242480 kB' 'Inactive(anon): 0 kB' 'Active(file): 232720 kB' 'Inactive(file): 70260 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1284008 kB' 'Mapped: 79880 kB' 'AnonPages: 261472 kB' 'Shmem: 981028 kB' 'KernelStack: 7048 kB' 'PageTables: 3728 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 81732 kB' 'Slab: 195092 kB' 'SReclaimable: 81732 kB' 'SUnreclaim: 113360 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:46.476 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [field-by-field scan of node1 meminfo: MemTotal, MemFree, MemUsed, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free; none matches HugePages_Surp, each iteration continues]
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return
0
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:46.477 node0=512 expecting 512
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:46.477 node1=1024 expecting 1024
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:04:46.477
00:04:46.477 real 0m3.910s
00:04:46.477 user 0m1.519s
00:04:46.477 sys 0m2.494s
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:46.477 18:08:29 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:46.477 ************************************
00:04:46.477 END TEST custom_alloc
00:04:46.477 ************************************
00:04:46.477 18:08:29 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:46.477 18:08:29 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:46.477 18:08:29 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:46.477 18:08:29
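[Editor's note] The scans traced above repeatedly exercise the `get_meminfo` helper from `setup/common.sh`: mapfile the (per-node) meminfo file, strip the `Node <id> ` prefix, then `IFS=': ' read` each `Key: value` line until the requested key matches. A minimal sketch of that pattern, paraphrased from the xtrace output rather than copied from the real SPDK source; the explicit FILE parameter is an editorial addition so the sketch is self-contained:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo loop seen at setup/common.sh@17-33 in the trace.
# Paraphrased from xtrace output; not the actual SPDK implementation.
shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {
	local get=$1 mem_f=$2   # key to look up, meminfo-format file to read
	local -a mem
	local var val _rest line
	mapfile -t mem <"$mem_f"
	# Per-node files under /sys prefix every line with "Node <id> "; drop it.
	mem=("${mem[@]#Node +([0-9]) }")
	for line in "${mem[@]}"; do
		# Split "Key: value [kB]" on colon and space, as in the trace.
		IFS=': ' read -r var val _rest <<<"$line"
		[[ $var == "$get" ]] || continue
		echo "$val"
		return 0
	done
	return 1
}
```

With a per-node file like the dumps above, `get_meminfo HugePages_Surp <file>` walks every field, printing the value only on an exact key match, which is why the trace shows one `continue` per non-matching field.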
setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:46.477 18:08:29 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:46.477 ************************************
00:04:46.477 START TEST no_shrink_alloc
00:04:46.477 ************************************
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g
nodes_test
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:46.477 18:08:29 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:49.763 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:04:49.763 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:04:49.763 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:04:49.763 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:49.763 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- #
mapfile -t mem
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79187536 kB' 'MemAvailable: 82459316 kB' 'Buffers: 11136 kB' 'Cached: 9252176 kB' 'SwapCached: 0 kB' 'Active: 6309268 kB' 'Inactive: 3441940 kB' 'Active(anon): 5918652 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490872 kB' 'Mapped: 185996 kB' 'Shmem: 5430756 kB' 'KReclaimable: 185732 kB' 'Slab: 481824 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296092 kB' 'KernelStack: 16208 kB' 'PageTables: 7828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7292632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB'
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- #
IFS=': ' 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:49.763 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.028 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.028 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 
18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:50.029 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.030 
18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79187388 kB' 'MemAvailable: 82459168 kB' 'Buffers: 11136 kB' 'Cached: 9252180 kB' 'SwapCached: 0 kB' 'Active: 6308232 kB' 'Inactive: 3441940 kB' 'Active(anon): 5917616 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490280 kB' 'Mapped: 185916 kB' 'Shmem: 5430760 kB' 'KReclaimable: 185732 kB' 'Slab: 481872 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296140 kB' 'KernelStack: 16160 kB' 'PageTables: 7808 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7292652 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200964 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.030 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:50.031 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.031 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79187924 kB' 'MemAvailable: 82459704 kB' 'Buffers: 11136 kB' 'Cached: 9252196 kB' 'SwapCached: 0 kB' 'Active: 6308208 kB' 'Inactive: 3441940 kB' 'Active(anon): 5917592 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490332 kB' 'Mapped: 185916 kB' 'Shmem: 5430776 kB' 'KReclaimable: 185732 kB' 'Slab: 481868 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296136 kB' 'KernelStack: 16144 kB' 'PageTables: 7756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7292672 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200980 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 
'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.032 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 
18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:50.033 nr_hugepages=1024 00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:50.033 resv_hugepages=0 00:04:50.033 
18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:50.033 surplus_hugepages=0
00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:50.033 anon_hugepages=0
00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:50.033 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:50.034 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:50.034 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:50.034 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:50.034 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:50.034 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:50.034 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:50.034 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:50.034 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:50.034 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79188316 kB' 'MemAvailable: 82460096 kB' 'Buffers: 11136 kB' 'Cached: 9252220 kB' 'SwapCached: 0 kB' 'Active: 6308300 kB' 'Inactive: 3441940 kB' 'Active(anon): 5917684 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 490284 kB' 'Mapped: 185916 kB' 'Shmem: 5430800 kB' 'KReclaimable: 185732 kB' 'Slab: 481868 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 296136 kB' 'KernelStack: 16160 kB' 'PageTables: 7804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7292696 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200980 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB'
00:04:50.034 [... setup/common.sh@31-32 loop trace elided: for each key printed above, from MemTotal through Unaccepted, the script runs IFS=': '; read -r var val _; the [[ $var == HugePages_Total ]] test fails and the loop takes the continue branch ...]
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
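The `get_meminfo` trace above boils down to a small parsing pattern: split each `Key: value [unit]` line of a meminfo file on `': '` and print the value of the first matching key. A minimal standalone sketch of that pattern (the function name and sample file below are illustrative, not part of `setup/common.sh`):

```shell
#!/usr/bin/env bash
# Sketch of the parsing loop traced above: read a meminfo-style file,
# split each line on ': ', and return the value of the first matching key.
get_meminfo_field() {
  local get=$1 mem_f=$2 var val _
  while IFS=': ' read -r var val _; do
    # Matching key found: print its value and stop (like common.sh@33).
    [[ $var == "$get" ]] && { printf '%s\n' "$val"; return 0; }
  done <"$mem_f"
  return 1
}

# Usage against a small sample instead of the live /proc/meminfo:
sample=$(mktemp)
printf '%s\n' 'MemTotal: 92293472 kB' 'HugePages_Total: 1024' >"$sample"
get_meminfo_field HugePages_Total "$sample"   # prints 1024
```

Because `IFS=': '` treats both the colon and the space as separators, the unit (`kB`) lands in the throwaway `_` variable and only the numeric value is returned.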
18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:50.035 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37082780 kB' 'MemUsed: 11034160 kB' 'SwapCached: 0 kB' 'Active: 4831180 kB' 'Inactive: 3371680 kB' 'Active(anon): 4673284 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7979240 kB' 'Mapped: 106036 kB' 'AnonPages: 226868 kB' 'Shmem: 4449664 kB' 'KernelStack: 9080 kB' 'PageTables: 3928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104000 kB' 'Slab: 286728 kB' 'SReclaimable: 104000 kB' 'SUnreclaim: 182728 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:50.036 [... setup/common.sh@31-32 loop trace elided: each node0 key printed above is read with IFS=': ' and compared against HugePages_Surp; every key from MemTotal through ShmemHugePages takes the continue branch, at which point this excerpt ends ...]
setup/common.sh@31 -- # read -r var val _ 00:04:50.036 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.036 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.036 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.036 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.036 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.036 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.036 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.037 18:08:33 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:50.037 node0=1024 expecting 1024 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:04:50.037 18:08:33 
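The long run of xtrace lines above is setup/common.sh's `get_meminfo` walking a meminfo listing one line at a time: `IFS=': '` splits each line into a key and a value, every key that is not the requested one hits `continue`, and the match ends in `echo`/`return`. A minimal sketch of that parsing idea (the inline sample data is hypothetical; the real helper reads /proc/meminfo, or a per-node meminfo file when a node is given):

```shell
# Extract one key's value from a meminfo-style listing, the way
# get_meminfo does. The here-string below is a hypothetical sample;
# setup/common.sh reads /proc/meminfo instead.
get=HugePages_Surp
meminfo='HugePages_Total: 1024
HugePages_Free: 1024
HugePages_Rsvd: 0
HugePages_Surp: 0'

while IFS=': ' read -r var val _; do
  # Skip every key until the requested one, then emit its value.
  [[ $var == "$get" ]] || continue
  echo "$val"
  break
done <<< "$meminfo"
```

Splitting on `': '` handles both the colon and the padding spaces in one pass, which is why each trace iteration shows `IFS=': '` immediately before `read -r var val _`.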
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:50.037 18:08:33 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:54.228 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:54.228 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:54.228 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:54.228 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:54.228 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:54.228 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:54.228 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:54.228 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:04:54.228 18:08:37 setup.sh.hugepages.no_shrink_alloc 
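The `INFO: Requested 512 hugepages but 1024 already allocated on node0` message follows from the `CLEAR_HUGE=no` / `NRHUGE=512` settings set just before setup.sh runs: with clearing disabled, an existing allocation that already covers the request is left in place. A hedged sketch of that decision (variable names taken from the log; the comparison itself is an assumption about setup.sh's internals):

```shell
# Hypothetical sketch of the allocation decision implied by the log:
# with CLEAR_HUGE=no, an existing allocation >= NRHUGE is kept as-is.
CLEAR_HUGE=no
NRHUGE=512
allocated=1024   # HugePages_Total already present on node0

if [[ $CLEAR_HUGE == no && $allocated -ge $NRHUGE ]]; then
  echo "Requested $NRHUGE hugepages but $allocated already allocated on node0"
fi
```

This is why the verification that follows still expects 1024 pages rather than the 512 just requested.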
-- setup/hugepages.sh@90 -- # local sorted_t 00:04:54.228 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:54.228 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79179280 kB' 'MemAvailable: 
82451060 kB' 'Buffers: 11136 kB' 'Cached: 9252304 kB' 'SwapCached: 0 kB' 'Active: 6310748 kB' 'Inactive: 3441940 kB' 'Active(anon): 5920132 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492400 kB' 'Mapped: 185924 kB' 'Shmem: 5430884 kB' 'KReclaimable: 185732 kB' 'Slab: 481620 kB' 'SReclaimable: 185732 kB' 'SUnreclaim: 295888 kB' 'KernelStack: 16752 kB' 'PageTables: 9432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7295620 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201220 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.229 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.230 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79181328 kB' 'MemAvailable: 82453104 kB' 'Buffers: 11136 kB' 'Cached: 9252308 kB' 'SwapCached: 0 kB' 'Active: 6310820 kB' 'Inactive: 3441940 kB' 'Active(anon): 5920204 kB' 
'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492480 kB' 'Mapped: 185916 kB' 'Shmem: 5430888 kB' 'KReclaimable: 185724 kB' 'Slab: 481628 kB' 'SReclaimable: 185724 kB' 'SUnreclaim: 295904 kB' 'KernelStack: 16848 kB' 'PageTables: 9724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7295636 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201108 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB'
00:04:54.230-00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [repetitive per-key scan condensed: every field from MemTotal through HugePages_Rsvd fails [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] and hits continue]
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:54.232 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79185444 kB' 'MemAvailable: 82457220 kB' 'Buffers: 11136 kB' 'Cached: 9252328 kB' 'SwapCached: 0 kB' 'Active: 6310820 kB' 'Inactive: 3441940 kB' 'Active(anon): 5920204 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492524 kB' 'Mapped: 185924 kB' 'Shmem: 5430908 kB' 'KReclaimable: 185724 kB' 'Slab: 481644 kB' 'SReclaimable: 185724 kB' 'SUnreclaim: 295920 kB' 'KernelStack: 16592 kB' 'PageTables: 9352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7295660 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201092 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB'
00:04:54.232-00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [repetitive per-key scan condensed: fields from MemTotal through CommitLimit each fail [[ $var == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] and hit continue; trace truncated mid-scan]
00:04:54.233 18:08:37 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.233 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 
18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:54.234 nr_hugepages=1024 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:54.234 resv_hugepages=0 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:54.234 surplus_hugepages=0 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:54.234 anon_hugepages=0 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 
00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 79183152 kB' 'MemAvailable: 82454928 kB' 'Buffers: 11136 kB' 'Cached: 9252348 kB' 'SwapCached: 0 kB' 'Active: 6310880 kB' 'Inactive: 3441940 kB' 'Active(anon): 5920264 kB' 'Inactive(anon): 0 kB' 'Active(file): 390616 kB' 'Inactive(file): 3441940 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492568 kB' 'Mapped: 185924 kB' 'Shmem: 5430928 kB' 'KReclaimable: 185724 kB' 'Slab: 481516 kB' 'SReclaimable: 185724 kB' 'SUnreclaim: 295792 kB' 'KernelStack: 16688 kB' 'PageTables: 9568 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7295680 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201172 kB' 'VmallocChunk: 0 kB' 'Percpu: 47360 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 634276 kB' 'DirectMap2M: 12673024 kB' 'DirectMap1G: 88080384 kB' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.234 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 
18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:54.235 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37086292 kB' 'MemUsed: 11030648 kB' 'SwapCached: 0 kB' 'Active: 4832716 kB' 'Inactive: 3371680 kB' 'Active(anon): 4674820 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3371680 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7979244 kB' 'Mapped: 106044 kB' 'AnonPages: 228288 kB' 'Shmem: 4449668 kB' 'KernelStack: 9432 kB' 'PageTables: 5408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 104000 kB' 'Slab: 286768 kB' 'SReclaimable: 104000 kB' 'SUnreclaim: 182768 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.236 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.237 18:08:37 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:54.237 node0=1024 expecting 1024 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:54.237 00:04:54.237 real 0m7.544s 00:04:54.237 user 0m2.967s 00:04:54.237 sys 0m4.770s 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:54.237 18:08:37 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:54.237 ************************************ 00:04:54.237 END TEST no_shrink_alloc 00:04:54.237 ************************************ 00:04:54.237 18:08:37 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:54.237 18:08:37 setup.sh.hugepages -- 
setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:54.237 18:08:37 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:54.237 00:04:54.237 real 0m29.154s 00:04:54.237 user 0m10.146s 00:04:54.238 sys 0m17.025s 00:04:54.238 18:08:37 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:54.238 18:08:37 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:54.238 ************************************ 00:04:54.238 END TEST hugepages 00:04:54.238 ************************************ 00:04:54.238 18:08:37 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:54.238 18:08:37 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:54.238 18:08:37 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:54.238 18:08:37 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:54.238 18:08:37 setup.sh -- common/autotest_common.sh@10 -- # set +x 
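The hugepages trace above is dominated by one pattern: `get_meminfo` in setup/common.sh walks a meminfo-style file with `IFS=': '` and `read -r var val _`, skipping every field (`continue`) until the key matches, then echoing the value. A minimal, self-contained sketch of that pattern (the input is inlined here; the real helper reads `/proc/meminfo` or `/sys/devices/system/node/nodeN/meminfo`, and the function name below is illustrative, not the script's own):

```shell
# Sketch of the get_meminfo parsing loop seen in the trace above.
# IFS=': ' splits each meminfo line on the colon and surrounding spaces,
# so "HugePages_Total: 1024" becomes var=HugePages_Total, val=1024.
get_meminfo_sketch() {
	get=$1
	while IFS=': ' read -r var val _; do
		# Skip fields until the requested key matches, then print its value.
		if [ "$var" = "$get" ]; then
			echo "$val"
			return 0
		fi
	done
	return 1
}

# Inlined stand-in for /proc/meminfo, mirroring the values in the log:
printf '%s\n' 'MemTotal: 48116940 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' |
	get_meminfo_sketch HugePages_Total
# prints: 1024
```

This explains the long runs of `[[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] ... continue` in the trace: with `set -x` every skipped field produces one comparison line, so a single `get_meminfo HugePages_Total` call emits dozens of log lines before reaching `echo 1024`.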
00:04:54.238 ************************************ 00:04:54.238 START TEST driver 00:04:54.238 ************************************ 00:04:54.238 18:08:37 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:54.238 * Looking for test storage... 00:04:54.238 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:54.238 18:08:37 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:54.238 18:08:37 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:54.238 18:08:37 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:59.508 18:08:42 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:59.508 18:08:42 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:59.508 18:08:42 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:59.508 18:08:42 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:59.508 ************************************ 00:04:59.508 START TEST guess_driver 00:04:59.508 ************************************ 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e 
/sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:59.508 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:59.508 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:59.508 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:59.508 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:59.508 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:59.508 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:59.508 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:59.508 
18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:59.508 Looking for driver=vfio-pci 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:59.508 18:08:42 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # 
[[ vfio-pci == vfio-pci ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:02.798 18:08:46 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:05.333 18:08:49 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:05.333 18:08:49 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:05.333 18:08:49 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:05.591 18:08:49 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:05.591 18:08:49 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:05.591 18:08:49 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:05.591 18:08:49 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:10.864 00:05:10.864 real 0m11.476s 00:05:10.864 user 0m2.875s 00:05:10.864 sys 0m5.414s 00:05:10.864 18:08:54 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:10.864 18:08:54 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:10.864 ************************************ 00:05:10.864 END TEST guess_driver 00:05:10.864 ************************************ 00:05:10.864 18:08:54 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:05:10.864 00:05:10.864 real 0m16.626s 00:05:10.864 user 0m4.394s 00:05:10.864 sys 0m8.268s 00:05:10.864 18:08:54 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:10.864 18:08:54 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:10.864 ************************************ 00:05:10.864 END TEST driver 00:05:10.864 ************************************ 00:05:10.864 18:08:54 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:10.864 18:08:54 setup.sh -- 
setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:10.864 18:08:54 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:10.864 18:08:54 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:10.864 18:08:54 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:10.864 ************************************ 00:05:10.864 START TEST devices 00:05:10.864 ************************************ 00:05:10.864 18:08:54 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:10.864 * Looking for test storage... 00:05:10.864 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:10.864 18:08:54 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:10.864 18:08:54 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:10.864 18:08:54 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:10.864 18:08:54 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:15.056 18:08:58 
setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:15.056 18:08:58 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:15.056 18:08:58 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:15.056 No valid GPT data, bailing 00:05:15.056 18:08:58 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:15.056 18:08:58 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:15.056 18:08:58 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:15.056 18:08:58 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:15.056 18:08:58 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:15.056 18:08:58 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:05:15.056 18:08:58 setup.sh.devices -- 
setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:15.056 18:08:58 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:15.056 18:08:58 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:15.056 ************************************ 00:05:15.056 START TEST nvme_mount 00:05:15.056 ************************************ 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:15.056 18:08:58 
setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:15.056 18:08:58 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:16.021 Creating new GPT entries in memory. 00:05:16.021 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:16.021 other utilities. 00:05:16.021 18:08:59 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:16.021 18:08:59 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:16.021 18:08:59 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:16.021 18:08:59 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:16.021 18:08:59 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:16.956 Creating new GPT entries in memory. 00:05:16.956 The operation has completed successfully. 
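The sgdisk call traced above (`--new=1:2048:2099199`) comes from the partition-range arithmetic in setup/common.sh: the 1 GiB size is converted from bytes to 512-byte sectors, the first partition starts at the conventional 2048-sector alignment boundary, and each subsequent partition starts right after the previous one ends. A minimal standalone sketch of that arithmetic (variable names follow the trace; the loop body is a hypothetical reconstruction, not the actual script):

```shell
# Sketch of the partition-range math visible in the xtrace above.
size=1073741824            # 1 GiB per partition, in bytes (as in the log)
(( size /= 512 ))          # bytes -> 512-byte sectors (common.sh@51)
part_start=0
part_end=0
for part in 1; do          # part_no=1 in this test
  # First partition begins at sector 2048; later ones follow the previous end.
  (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
  (( part_end = part_start + size - 1 ))
  echo "--new=${part}:${part_start}:${part_end}"
done
```

Run as-is this prints `--new=1:2048:2099199`, matching the flocked `sgdisk /dev/nvme0n1 --new=1:2048:2099199` invocation in the trace.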
00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 2400502 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:16.956 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:16.957 
18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:16.957 18:09:00 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:21.142 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:21.143 18:09:03 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:21.143 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:21.143 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:21.143 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:21.143 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:21.143 
/dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:21.143 18:09:04 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:24.430 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:24.689 18:09:08 
setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:24.689 18:09:08 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:24.689 18:09:08 
setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all 
/dev/nvme0n1 00:05:27.976 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:27.976 00:05:27.976 real 0m13.098s 00:05:27.976 user 0m3.766s 00:05:27.976 sys 0m7.178s 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:27.976 18:09:11 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 ************************************ 00:05:27.976 END TEST nvme_mount 00:05:27.976 ************************************ 00:05:27.976 18:09:11 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:27.976 18:09:11 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:27.976 18:09:11 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:27.976 18:09:11 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:27.976 18:09:11 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:27.976 ************************************ 00:05:27.976 START TEST dm_mount 00:05:27.976 ************************************ 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:27.976 18:09:11 
setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:27.976 18:09:11 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:28.914 Creating new GPT entries in memory. 00:05:28.914 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:28.914 other utilities. 00:05:28.914 18:09:12 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:28.914 18:09:12 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:28.914 18:09:12 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:05:28.914 18:09:12 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:28.914 18:09:12 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:30.289 Creating new GPT entries in memory. 00:05:30.289 The operation has completed successfully. 00:05:30.289 18:09:13 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:30.289 18:09:13 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:30.289 18:09:13 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:30.289 18:09:13 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:30.289 18:09:13 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:31.225 The operation has completed successfully. 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 2405197 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e 
/dev/mapper/nvme_dm_test ]] 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- 
setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:31.225 18:09:14 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.516 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local 
found=0 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:34.777 18:09:18 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:38.068 18:09:21 
setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:38.068 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:38.327 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:38.327 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:38.327 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:38.327 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:38.327 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:38.327 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:38.327 18:09:21 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:38.327 00:05:38.327 real 0m10.245s 00:05:38.327 user 0m2.559s 00:05:38.327 sys 0m4.785s 00:05:38.327 18:09:21 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:38.327 18:09:21 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:38.327 ************************************ 00:05:38.327 END TEST dm_mount 00:05:38.327 ************************************ 00:05:38.327 18:09:21 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:38.327 18:09:21 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:38.327 18:09:21 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:38.327 18:09:21 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.327 18:09:21 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:38.327 18:09:21 setup.sh.devices -- setup/devices.sh@25 -- # wipefs 
--all /dev/nvme0n1p1 00:05:38.327 18:09:21 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:38.327 18:09:21 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:38.586 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:38.586 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:38.586 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:38.586 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:38.586 18:09:22 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:38.586 18:09:22 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:38.586 18:09:22 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:38.586 18:09:22 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:38.586 18:09:22 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:38.586 18:09:22 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:38.586 18:09:22 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:38.586 00:05:38.586 real 0m27.917s 00:05:38.586 user 0m7.878s 00:05:38.586 sys 0m14.891s 00:05:38.586 18:09:22 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:38.586 18:09:22 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:38.586 ************************************ 00:05:38.586 END TEST devices 00:05:38.586 ************************************ 00:05:38.586 18:09:22 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:38.586 00:05:38.586 real 1m40.688s 00:05:38.586 user 0m30.760s 00:05:38.586 sys 0m55.819s 00:05:38.586 18:09:22 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:38.586 18:09:22 setup.sh -- common/autotest_common.sh@10 -- # set +x 
00:05:38.586 ************************************ 00:05:38.586 END TEST setup.sh 00:05:38.586 ************************************ 00:05:38.586 18:09:22 -- common/autotest_common.sh@1142 -- # return 0 00:05:38.586 18:09:22 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:41.921 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:41.921 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:41.921 Hugepages 00:05:41.921 node hugesize free / total 00:05:41.921 node0 1048576kB 0 / 0 00:05:41.921 node0 2048kB 1024 / 1024 00:05:41.921 node1 1048576kB 0 / 0 00:05:41.921 node1 2048kB 1024 / 1024 00:05:41.921 00:05:41.921 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:41.921 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:41.921 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:41.921 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:41.921 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:41.921 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:41.921 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:41.921 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:41.921 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:41.921 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:41.921 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:41.921 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:41.921 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:41.921 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:41.921 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:41.921 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:41.921 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:41.921 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:41.921 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:05:41.921 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:05:42.194 18:09:25 -- spdk/autotest.sh@130 -- # uname -s 00:05:42.194 18:09:25 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 
00:05:42.194 18:09:25 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:42.194 18:09:25 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:45.482 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:45.482 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:45.482 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:45.482 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:45.482 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:45.741 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:48.273 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:48.532 18:09:32 -- common/autotest_common.sh@1532 -- # sleep 1 00:05:49.468 18:09:33 -- common/autotest_common.sh@1533 -- # bdfs=() 00:05:49.468 18:09:33 -- common/autotest_common.sh@1533 -- # local bdfs 00:05:49.468 18:09:33 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:05:49.468 18:09:33 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:05:49.468 18:09:33 -- common/autotest_common.sh@1513 -- # bdfs=() 00:05:49.468 18:09:33 -- common/autotest_common.sh@1513 -- # local bdfs 00:05:49.468 18:09:33 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r 
'.config[].params.traddr')) 00:05:49.468 18:09:33 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:49.468 18:09:33 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:05:49.468 18:09:33 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:05:49.468 18:09:33 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:05:49.468 18:09:33 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:52.755 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:52.755 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:52.755 Waiting for block devices as requested 00:05:53.013 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:05:53.013 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:53.013 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:53.272 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:53.272 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:53.272 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:53.530 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:53.531 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:53.531 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:53.789 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:53.789 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:53.789 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:54.049 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:54.049 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:54.049 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:54.307 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:54.307 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:54.307 18:09:37 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:05:54.307 18:09:37 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:05:54.307 18:09:37 -- 
common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:05:54.307 18:09:37 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:05:54.307 18:09:38 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:54.307 18:09:38 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:05:54.307 18:09:38 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:05:54.307 18:09:38 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:05:54.307 18:09:38 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:05:54.307 18:09:38 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:05:54.307 18:09:38 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:05:54.307 18:09:38 -- common/autotest_common.sh@1545 -- # grep oacs 00:05:54.307 18:09:38 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:05:54.307 18:09:38 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:05:54.307 18:09:38 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:05:54.307 18:09:38 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:05:54.307 18:09:38 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:05:54.307 18:09:38 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:05:54.307 18:09:38 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:05:54.307 18:09:38 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:05:54.307 18:09:38 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:05:54.307 18:09:38 -- common/autotest_common.sh@1557 -- # continue 00:05:54.307 18:09:38 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:54.307 18:09:38 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:54.307 18:09:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.565 18:09:38 -- spdk/autotest.sh@138 -- # timing_enter afterboot 
00:05:54.566 18:09:38 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:54.566 18:09:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.566 18:09:38 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:57.852 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:57.852 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:58.111 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:58.111 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:00.642 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:00.901 18:09:44 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:06:00.901 18:09:44 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:00.901 18:09:44 -- common/autotest_common.sh@10 -- # set +x 00:06:00.901 18:09:44 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:06:00.901 18:09:44 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:06:00.901 18:09:44 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:06:00.901 18:09:44 -- common/autotest_common.sh@1577 -- # bdfs=() 00:06:00.901 18:09:44 -- 
common/autotest_common.sh@1577 -- # local bdfs 00:06:00.901 18:09:44 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:06:00.901 18:09:44 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:00.901 18:09:44 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:00.901 18:09:44 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:00.901 18:09:44 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:00.901 18:09:44 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:00.901 18:09:44 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:00.901 18:09:44 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:00.901 18:09:44 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:06:00.901 18:09:44 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:06:00.901 18:09:44 -- common/autotest_common.sh@1580 -- # device=0x0b60 00:06:00.901 18:09:44 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:06:00.901 18:09:44 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:06:00.901 18:09:44 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:06:00.901 18:09:44 -- common/autotest_common.sh@1593 -- # return 0 00:06:00.901 18:09:44 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:00.901 18:09:44 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:00.901 18:09:44 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:00.901 18:09:44 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:00.901 18:09:44 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:01.836 Restarting all devices. 
00:06:05.124 lstat() error: No such file or directory
00:06:05.124 QAT Error: No GENERAL section found
00:06:05.124 Failed to configure qat_dev0
00:06:05.124 lstat() error: No such file or directory
00:06:05.124 QAT Error: No GENERAL section found
00:06:05.124 Failed to configure qat_dev1
00:06:05.124 lstat() error: No such file or directory
00:06:05.124 QAT Error: No GENERAL section found
00:06:05.124 Failed to configure qat_dev2
00:06:05.124 enable sriov
00:06:05.124 Checking status of all devices.
00:06:05.124 There is 3 QAT acceleration device(s) in the system:
00:06:05.124  qat_dev0 - type: c6xx,  inst_id: 0,  node_id: 0,  bsf: 0000:3d:00.0,  #accel: 5 #engines: 10 state: down
00:06:05.124  qat_dev1 - type: c6xx,  inst_id: 1,  node_id: 0,  bsf: 0000:3f:00.0,  #accel: 5 #engines: 10 state: down
00:06:05.124  qat_dev2 - type: c6xx,  inst_id: 2,  node_id: 1,  bsf: 0000:da:00.0,  #accel: 5 #engines: 10 state: down
00:06:06.059 0000:3d:00.0 set to 16 VFs
00:06:06.628 0000:3f:00.0 set to 16 VFs
00:06:07.562 0000:da:00.0 set to 16 VFs
00:06:08.940 Properly configured the qat device with driver uio_pci_generic.
00:06:08.940 18:09:52 -- spdk/autotest.sh@162 -- # timing_enter lib
00:06:08.940 18:09:52 -- common/autotest_common.sh@722 -- # xtrace_disable
00:06:08.940 18:09:52 -- common/autotest_common.sh@10 -- # set +x
00:06:08.940 18:09:52 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]]
00:06:08.940 18:09:52 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh
00:06:08.940 18:09:52 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:08.940 18:09:52 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:08.940 18:09:52 -- common/autotest_common.sh@10 -- # set +x
00:06:08.940 ************************************
00:06:08.940 START TEST env
00:06:08.940 ************************************
00:06:08.940 18:09:52 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh
00:06:08.940 * Looking for test storage...
00:06:08.940 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env
00:06:08.940 18:09:52 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut
00:06:08.940 18:09:52 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:08.940 18:09:52 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:08.940 18:09:52 env -- common/autotest_common.sh@10 -- # set +x
00:06:08.940 ************************************
00:06:08.940 START TEST env_memory
00:06:08.940 ************************************
00:06:08.940 18:09:52 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut
00:06:08.940
00:06:08.940
00:06:08.940 CUnit - A unit testing framework for C - Version 2.1-3
00:06:08.940 http://cunit.sourceforge.net/
00:06:08.940
00:06:08.940
00:06:08.940 Suite: memory
00:06:09.199 Test: alloc and free memory map ...[2024-07-12 18:09:52.689630] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed
00:06:09.199 passed
00:06:09.199 Test: mem map translation ...[2024-07-12 18:09:52.718957] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234
00:06:09.199 [2024-07-12 18:09:52.718981] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152
00:06:09.199 [2024-07-12 18:09:52.719037] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656
00:06:09.199 [2024-07-12 18:09:52.719050] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map
00:06:09.199 passed
00:06:09.199 Test: mem map registration ...[2024-07-12 18:09:52.776719] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234
00:06:09.199 [2024-07-12 18:09:52.776753] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152
00:06:09.199 passed
00:06:09.199 Test: mem map adjacent registrations ...passed
00:06:09.199
00:06:09.199 Run Summary: Type Total Ran Passed Failed Inactive
00:06:09.199 suites 1 1 n/a 0 0
00:06:09.199 tests 4 4 4 0 0
00:06:09.199 asserts 152 152 152 0 n/a
00:06:09.199
00:06:09.199 Elapsed time = 0.198 seconds
00:06:09.199
00:06:09.199 real 0m0.213s
00:06:09.199 user 0m0.199s
00:06:09.199 sys 0m0.013s
00:06:09.199 18:09:52 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:09.199 18:09:52 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:09.199 ************************************ 00:06:09.199 END TEST env_memory 00:06:09.199 ************************************ 00:06:09.199 18:09:52 env -- common/autotest_common.sh@1142 -- # return 0 00:06:09.199 18:09:52 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:09.199 18:09:52 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:09.199 18:09:52 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:09.199 18:09:52 env -- common/autotest_common.sh@10 -- # set +x 00:06:09.458 ************************************ 00:06:09.458 START TEST env_vtophys 00:06:09.458 ************************************ 00:06:09.458 18:09:52 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:09.458 EAL: lib.eal log level changed from notice to debug 00:06:09.458 EAL: Detected lcore 0 as core 0 on socket 0 00:06:09.458 EAL: Detected lcore 1 as core 1 on socket 0 00:06:09.458 EAL: Detected lcore 2 as core 2 on socket 0 00:06:09.458 EAL: Detected lcore 3 as core 3 on socket 0 00:06:09.458 EAL: Detected lcore 4 as core 4 on socket 0 00:06:09.458 EAL: Detected lcore 5 as core 8 on socket 0 00:06:09.458 EAL: Detected lcore 6 as core 9 on socket 0 00:06:09.458 EAL: Detected lcore 7 as core 10 on socket 0 00:06:09.458 EAL: Detected lcore 8 as core 11 on socket 0 00:06:09.458 EAL: Detected lcore 9 as core 16 on socket 0 00:06:09.458 EAL: Detected lcore 10 as core 17 on socket 0 00:06:09.458 EAL: Detected lcore 11 as core 18 on socket 0 00:06:09.458 EAL: Detected lcore 12 as core 19 on socket 0 00:06:09.458 EAL: Detected lcore 13 as core 20 on socket 0 00:06:09.458 EAL: Detected lcore 14 as core 24 on socket 0 00:06:09.458 EAL: Detected lcore 15 as core 25 on socket 0 00:06:09.458 EAL: Detected lcore 16 as core 26 on socket 0 
00:06:09.458 EAL: Detected lcore 17 as core 27 on socket 0 00:06:09.458 EAL: Detected lcore 18 as core 0 on socket 1 00:06:09.458 EAL: Detected lcore 19 as core 1 on socket 1 00:06:09.458 EAL: Detected lcore 20 as core 2 on socket 1 00:06:09.458 EAL: Detected lcore 21 as core 3 on socket 1 00:06:09.458 EAL: Detected lcore 22 as core 4 on socket 1 00:06:09.458 EAL: Detected lcore 23 as core 8 on socket 1 00:06:09.458 EAL: Detected lcore 24 as core 9 on socket 1 00:06:09.458 EAL: Detected lcore 25 as core 10 on socket 1 00:06:09.458 EAL: Detected lcore 26 as core 11 on socket 1 00:06:09.458 EAL: Detected lcore 27 as core 16 on socket 1 00:06:09.458 EAL: Detected lcore 28 as core 17 on socket 1 00:06:09.458 EAL: Detected lcore 29 as core 18 on socket 1 00:06:09.458 EAL: Detected lcore 30 as core 19 on socket 1 00:06:09.458 EAL: Detected lcore 31 as core 20 on socket 1 00:06:09.458 EAL: Detected lcore 32 as core 24 on socket 1 00:06:09.458 EAL: Detected lcore 33 as core 25 on socket 1 00:06:09.458 EAL: Detected lcore 34 as core 26 on socket 1 00:06:09.458 EAL: Detected lcore 35 as core 27 on socket 1 00:06:09.458 EAL: Detected lcore 36 as core 0 on socket 0 00:06:09.458 EAL: Detected lcore 37 as core 1 on socket 0 00:06:09.458 EAL: Detected lcore 38 as core 2 on socket 0 00:06:09.458 EAL: Detected lcore 39 as core 3 on socket 0 00:06:09.458 EAL: Detected lcore 40 as core 4 on socket 0 00:06:09.458 EAL: Detected lcore 41 as core 8 on socket 0 00:06:09.458 EAL: Detected lcore 42 as core 9 on socket 0 00:06:09.458 EAL: Detected lcore 43 as core 10 on socket 0 00:06:09.458 EAL: Detected lcore 44 as core 11 on socket 0 00:06:09.458 EAL: Detected lcore 45 as core 16 on socket 0 00:06:09.458 EAL: Detected lcore 46 as core 17 on socket 0 00:06:09.458 EAL: Detected lcore 47 as core 18 on socket 0 00:06:09.458 EAL: Detected lcore 48 as core 19 on socket 0 00:06:09.458 EAL: Detected lcore 49 as core 20 on socket 0 00:06:09.459 EAL: Detected lcore 50 as core 24 on socket 0 
00:06:09.459 EAL: Detected lcore 51 as core 25 on socket 0 00:06:09.459 EAL: Detected lcore 52 as core 26 on socket 0 00:06:09.459 EAL: Detected lcore 53 as core 27 on socket 0 00:06:09.459 EAL: Detected lcore 54 as core 0 on socket 1 00:06:09.459 EAL: Detected lcore 55 as core 1 on socket 1 00:06:09.459 EAL: Detected lcore 56 as core 2 on socket 1 00:06:09.459 EAL: Detected lcore 57 as core 3 on socket 1 00:06:09.459 EAL: Detected lcore 58 as core 4 on socket 1 00:06:09.459 EAL: Detected lcore 59 as core 8 on socket 1 00:06:09.459 EAL: Detected lcore 60 as core 9 on socket 1 00:06:09.459 EAL: Detected lcore 61 as core 10 on socket 1 00:06:09.459 EAL: Detected lcore 62 as core 11 on socket 1 00:06:09.459 EAL: Detected lcore 63 as core 16 on socket 1 00:06:09.459 EAL: Detected lcore 64 as core 17 on socket 1 00:06:09.459 EAL: Detected lcore 65 as core 18 on socket 1 00:06:09.459 EAL: Detected lcore 66 as core 19 on socket 1 00:06:09.459 EAL: Detected lcore 67 as core 20 on socket 1 00:06:09.459 EAL: Detected lcore 68 as core 24 on socket 1 00:06:09.459 EAL: Detected lcore 69 as core 25 on socket 1 00:06:09.459 EAL: Detected lcore 70 as core 26 on socket 1 00:06:09.459 EAL: Detected lcore 71 as core 27 on socket 1 00:06:09.459 EAL: Maximum logical cores by configuration: 128 00:06:09.459 EAL: Detected CPU lcores: 72 00:06:09.459 EAL: Detected NUMA nodes: 2 00:06:09.459 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:09.459 EAL: Detected shared linkage of DPDK 00:06:09.459 EAL: No shared files mode enabled, IPC will be disabled 00:06:09.459 EAL: No shared files mode enabled, IPC is disabled 00:06:09.459 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 
'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:06:09.459 EAL: 
PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:06:09.459 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:06:09.459 EAL: Bus pci wants IOVA as 'PA' 00:06:09.459 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:09.459 EAL: Bus vdev wants IOVA as 'DC' 00:06:09.459 EAL: Selected IOVA mode 'PA' 00:06:09.459 EAL: Probing VFIO support... 00:06:09.459 EAL: IOMMU type 1 (Type 1) is supported 00:06:09.459 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:09.459 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:09.459 EAL: VFIO support initialized 00:06:09.459 EAL: Ask a virtual area of 0x2e000 bytes 00:06:09.459 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:09.459 EAL: Setting up physically contiguous memory... 
00:06:09.459 EAL: Setting maximum number of open files to 524288 00:06:09.459 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:09.459 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:09.459 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:09.459 EAL: Ask a virtual area of 0x61000 bytes 00:06:09.459 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:09.459 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:09.459 EAL: Ask a virtual area of 0x400000000 bytes 00:06:09.459 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:09.459 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:09.459 EAL: Ask a virtual area of 0x61000 bytes 00:06:09.459 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:09.459 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:09.459 EAL: Ask a virtual area of 0x400000000 bytes 00:06:09.459 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:09.459 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:09.459 EAL: Ask a virtual area of 0x61000 bytes 00:06:09.459 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:09.459 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:09.459 EAL: Ask a virtual area of 0x400000000 bytes 00:06:09.459 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:09.459 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:09.459 EAL: Ask a virtual area of 0x61000 bytes 00:06:09.459 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:09.459 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:09.459 EAL: Ask a virtual area of 0x400000000 bytes 00:06:09.459 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:09.459 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:09.459 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:06:09.459 EAL: Ask a virtual area of 0x61000 bytes 00:06:09.459 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:09.459 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:09.459 EAL: Ask a virtual area of 0x400000000 bytes 00:06:09.459 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:09.459 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:09.459 EAL: Ask a virtual area of 0x61000 bytes 00:06:09.459 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:09.459 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:09.459 EAL: Ask a virtual area of 0x400000000 bytes 00:06:09.459 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:09.459 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:09.459 EAL: Ask a virtual area of 0x61000 bytes 00:06:09.459 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:09.459 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:09.459 EAL: Ask a virtual area of 0x400000000 bytes 00:06:09.459 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:09.459 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:09.459 EAL: Ask a virtual area of 0x61000 bytes 00:06:09.459 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:09.459 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:09.459 EAL: Ask a virtual area of 0x400000000 bytes 00:06:09.459 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:09.459 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:09.459 EAL: Hugepages will be freed exactly as allocated. 
00:06:09.459 EAL: No shared files mode enabled, IPC is disabled 00:06:09.459 EAL: No shared files mode enabled, IPC is disabled 00:06:09.459 EAL: TSC frequency is ~2300000 KHz 00:06:09.459 EAL: Main lcore 0 is ready (tid=7f733129db00;cpuset=[0]) 00:06:09.459 EAL: Trying to obtain current memory policy. 00:06:09.459 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:09.459 EAL: Restoring previous memory policy: 0 00:06:09.459 EAL: request: mp_malloc_sync 00:06:09.459 EAL: No shared files mode enabled, IPC is disabled 00:06:09.459 EAL: Heap on socket 0 was expanded by 2MB 00:06:09.459 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001000000 00:06:09.459 EAL: PCI memory mapped at 0x202001001000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001002000 00:06:09.459 EAL: PCI memory mapped at 0x202001003000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001004000 00:06:09.459 EAL: PCI memory mapped at 0x202001005000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001006000 00:06:09.459 EAL: PCI memory mapped at 0x202001007000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001008000 00:06:09.459 EAL: PCI memory mapped at 0x202001009000 00:06:09.459 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x20200100a000 00:06:09.459 EAL: PCI memory mapped at 0x20200100b000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x20200100c000 00:06:09.459 EAL: PCI memory mapped at 0x20200100d000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x20200100e000 00:06:09.459 EAL: PCI memory mapped at 0x20200100f000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001010000 00:06:09.459 EAL: PCI memory mapped at 0x202001011000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001012000 00:06:09.459 EAL: PCI memory mapped at 0x202001013000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001014000 00:06:09.459 EAL: PCI memory mapped at 0x202001015000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 
0x202001016000 00:06:09.459 EAL: PCI memory mapped at 0x202001017000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001018000 00:06:09.459 EAL: PCI memory mapped at 0x202001019000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x20200101a000 00:06:09.459 EAL: PCI memory mapped at 0x20200101b000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x20200101c000 00:06:09.459 EAL: PCI memory mapped at 0x20200101d000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:09.459 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x20200101e000 00:06:09.459 EAL: PCI memory mapped at 0x20200101f000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:09.459 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001020000 00:06:09.459 EAL: PCI memory mapped at 0x202001021000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:09.459 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001022000 00:06:09.459 EAL: PCI memory mapped at 0x202001023000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:09.459 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 
00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001024000 00:06:09.459 EAL: PCI memory mapped at 0x202001025000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:09.459 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001026000 00:06:09.459 EAL: PCI memory mapped at 0x202001027000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:09.459 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x202001028000 00:06:09.459 EAL: PCI memory mapped at 0x202001029000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:09.459 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.459 EAL: PCI memory mapped at 0x20200102a000 00:06:09.459 EAL: PCI memory mapped at 0x20200102b000 00:06:09.459 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:09.459 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:06:09.459 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x20200102c000 00:06:09.460 EAL: PCI memory mapped at 0x20200102d000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:09.460 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x20200102e000 00:06:09.460 EAL: PCI memory mapped at 0x20200102f000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:09.460 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001030000 00:06:09.460 EAL: PCI memory mapped at 0x202001031000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:3f:02.0 (socket 0) 00:06:09.460 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001032000 00:06:09.460 EAL: PCI memory mapped at 0x202001033000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:09.460 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001034000 00:06:09.460 EAL: PCI memory mapped at 0x202001035000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:09.460 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001036000 00:06:09.460 EAL: PCI memory mapped at 0x202001037000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:09.460 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001038000 00:06:09.460 EAL: PCI memory mapped at 0x202001039000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:09.460 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x20200103a000 00:06:09.460 EAL: PCI memory mapped at 0x20200103b000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:09.460 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x20200103c000 00:06:09.460 EAL: PCI memory mapped at 0x20200103d000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:09.460 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x20200103e000 00:06:09.460 EAL: PCI memory 
mapped at 0x20200103f000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:09.460 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001040000 00:06:09.460 EAL: PCI memory mapped at 0x202001041000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:09.460 EAL: Trying to obtain current memory policy. 00:06:09.460 EAL: Setting policy MPOL_PREFERRED for socket 1 00:06:09.460 EAL: Restoring previous memory policy: 4 00:06:09.460 EAL: request: mp_malloc_sync 00:06:09.460 EAL: No shared files mode enabled, IPC is disabled 00:06:09.460 EAL: Heap on socket 1 was expanded by 2MB 00:06:09.460 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001042000 00:06:09.460 EAL: PCI memory mapped at 0x202001043000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001044000 00:06:09.460 EAL: PCI memory mapped at 0x202001045000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001046000 00:06:09.460 EAL: PCI memory mapped at 0x202001047000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001048000 00:06:09.460 EAL: PCI memory mapped at 0x202001049000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:06:09.460 
EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x20200104a000 00:06:09.460 EAL: PCI memory mapped at 0x20200104b000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x20200104c000 00:06:09.460 EAL: PCI memory mapped at 0x20200104d000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x20200104e000 00:06:09.460 EAL: PCI memory mapped at 0x20200104f000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001050000 00:06:09.460 EAL: PCI memory mapped at 0x202001051000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001052000 00:06:09.460 EAL: PCI memory mapped at 0x202001053000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001054000 00:06:09.460 EAL: PCI memory mapped at 0x202001055000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:09.460 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:06:09.460 EAL: probe driver: 8086:37c9 qat 00:06:09.460 EAL: PCI memory mapped at 0x202001056000 00:06:09.460 EAL: PCI memory mapped at 0x202001057000 00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 
(socket 1)
00:06:09.460 EAL: PCI device 0000:da:02.4 on NUMA socket 1
00:06:09.460 EAL: probe driver: 8086:37c9 qat
00:06:09.460 EAL: PCI memory mapped at 0x202001058000
00:06:09.460 EAL: PCI memory mapped at 0x202001059000
00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:06:09.460 EAL: PCI device 0000:da:02.5 on NUMA socket 1
00:06:09.460 EAL: probe driver: 8086:37c9 qat
00:06:09.460 EAL: PCI memory mapped at 0x20200105a000
00:06:09.460 EAL: PCI memory mapped at 0x20200105b000
00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:06:09.460 EAL: PCI device 0000:da:02.6 on NUMA socket 1
00:06:09.460 EAL: probe driver: 8086:37c9 qat
00:06:09.460 EAL: PCI memory mapped at 0x20200105c000
00:06:09.460 EAL: PCI memory mapped at 0x20200105d000
00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:06:09.460 EAL: PCI device 0000:da:02.7 on NUMA socket 1
00:06:09.460 EAL: probe driver: 8086:37c9 qat
00:06:09.460 EAL: PCI memory mapped at 0x20200105e000
00:06:09.460 EAL: PCI memory mapped at 0x20200105f000
00:06:09.460 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: No PCI address specified using 'addr=' in: bus=pci
00:06:09.460 EAL: Mem event callback 'spdk:(nil)' registered
00:06:09.460
00:06:09.460
00:06:09.460      CUnit - A unit testing framework for C - Version 2.1-3
00:06:09.460      http://cunit.sourceforge.net/
00:06:09.460
00:06:09.460
00:06:09.460 Suite: components_suite
00:06:09.460   Test: vtophys_malloc_test ...passed
00:06:09.460   Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:06:09.460 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:09.460 EAL: Restoring previous memory policy: 4
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was expanded by 4MB
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was shrunk by 4MB
00:06:09.460 EAL: Trying to obtain current memory policy.
00:06:09.460 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:09.460 EAL: Restoring previous memory policy: 4
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was expanded by 6MB
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was shrunk by 6MB
00:06:09.460 EAL: Trying to obtain current memory policy.
00:06:09.460 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:09.460 EAL: Restoring previous memory policy: 4
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was expanded by 10MB
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was shrunk by 10MB
00:06:09.460 EAL: Trying to obtain current memory policy.
00:06:09.460 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:09.460 EAL: Restoring previous memory policy: 4
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was expanded by 18MB
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was shrunk by 18MB
00:06:09.460 EAL: Trying to obtain current memory policy.
00:06:09.460 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:09.460 EAL: Restoring previous memory policy: 4
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was expanded by 34MB
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was shrunk by 34MB
00:06:09.460 EAL: Trying to obtain current memory policy.
00:06:09.460 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:09.460 EAL: Restoring previous memory policy: 4
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was expanded by 66MB
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was shrunk by 66MB
00:06:09.460 EAL: Trying to obtain current memory policy.
00:06:09.460 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:09.460 EAL: Restoring previous memory policy: 4
00:06:09.460 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.460 EAL: request: mp_malloc_sync
00:06:09.460 EAL: No shared files mode enabled, IPC is disabled
00:06:09.460 EAL: Heap on socket 0 was expanded by 130MB
00:06:09.719 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.719 EAL: request: mp_malloc_sync
00:06:09.719 EAL: No shared files mode enabled, IPC is disabled
00:06:09.719 EAL: Heap on socket 0 was shrunk by 130MB
00:06:09.719 EAL: Trying to obtain current memory policy.
00:06:09.719 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:09.719 EAL: Restoring previous memory policy: 4
00:06:09.719 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.719 EAL: request: mp_malloc_sync
00:06:09.719 EAL: No shared files mode enabled, IPC is disabled
00:06:09.719 EAL: Heap on socket 0 was expanded by 258MB
00:06:09.719 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.719 EAL: request: mp_malloc_sync
00:06:09.719 EAL: No shared files mode enabled, IPC is disabled
00:06:09.719 EAL: Heap on socket 0 was shrunk by 258MB
00:06:09.719 EAL: Trying to obtain current memory policy.
00:06:09.719 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:09.978 EAL: Restoring previous memory policy: 4
00:06:09.978 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.978 EAL: request: mp_malloc_sync
00:06:09.978 EAL: No shared files mode enabled, IPC is disabled
00:06:09.978 EAL: Heap on socket 0 was expanded by 514MB
00:06:09.978 EAL: Calling mem event callback 'spdk:(nil)'
00:06:09.978 EAL: request: mp_malloc_sync
00:06:09.978 EAL: No shared files mode enabled, IPC is disabled
00:06:09.978 EAL: Heap on socket 0 was shrunk by 514MB
00:06:09.978 EAL: Trying to obtain current memory policy.
00:06:09.978 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:10.237 EAL: Restoring previous memory policy: 4
00:06:10.237 EAL: Calling mem event callback 'spdk:(nil)'
00:06:10.237 EAL: request: mp_malloc_sync
00:06:10.237 EAL: No shared files mode enabled, IPC is disabled
00:06:10.237 EAL: Heap on socket 0 was expanded by 1026MB
00:06:10.495 EAL: Calling mem event callback 'spdk:(nil)'
00:06:10.754 EAL: request: mp_malloc_sync
00:06:10.754 EAL: No shared files mode enabled, IPC is disabled
00:06:10.754 EAL: Heap on socket 0 was shrunk by 1026MB
00:06:10.754 passed
00:06:10.754
00:06:10.754 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:06:10.754               suites      1      1    n/a      0        0
00:06:10.754                tests      2      2      2      0        0
00:06:10.754              asserts   5799   5799   5799      0      n/a
00:06:10.754
00:06:10.754 Elapsed time = 1.185 seconds
00:06:10.754 EAL: No shared files mode enabled, IPC is disabled
00:06:10.754 EAL: No shared files mode enabled, IPC is disabled
00:06:10.754 EAL: No shared files mode enabled, IPC is disabled
00:06:10.754
00:06:10.754 real 0m1.369s
00:06:10.754 user 0m0.781s
00:06:10.754 sys 0m0.557s
00:06:10.754 18:09:54 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:10.754 18:09:54 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:06:10.754 ************************************
00:06:10.754 END TEST env_vtophys
00:06:10.754 ************************************
00:06:10.754 18:09:54 env -- common/autotest_common.sh@1142 -- # return 0
00:06:10.754 18:09:54 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:10.754 18:09:54 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:10.754 18:09:54 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:10.754 18:09:54 env -- common/autotest_common.sh@10 -- # set +x
00:06:10.754 ************************************
00:06:10.754 START TEST env_pci
00:06:10.754 ************************************
00:06:10.754 18:09:54 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:10.754
00:06:10.754
00:06:10.754      CUnit - A unit testing framework for C - Version 2.1-3
00:06:10.754      http://cunit.sourceforge.net/
00:06:10.754
00:06:10.754
00:06:10.754 Suite: pci
00:06:10.754   Test: pci_hook ...[2024-07-12 18:09:54.411204] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2415954 has claimed it
00:06:10.754 EAL: Cannot find device (10000:00:01.0)
00:06:10.754 EAL: Failed to attach device on primary process
00:06:10.754 passed
00:06:10.754
00:06:10.754 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:06:10.754               suites      1      1    n/a      0        0
00:06:10.754                tests      1      1      1      0        0
00:06:10.754              asserts     25     25     25      0      n/a
00:06:10.754
00:06:10.754 Elapsed time = 0.042 seconds
00:06:10.754
00:06:10.754 real 0m0.067s
00:06:10.754 user 0m0.022s
00:06:10.754 sys 0m0.044s
00:06:10.754 18:09:54 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:10.754 18:09:54 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:06:10.754 ************************************
00:06:10.754 END TEST env_pci
00:06:10.754 ************************************
00:06:11.013 18:09:54 env -- common/autotest_common.sh@1142 -- # return 0
00:06:11.013 18:09:54 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:06:11.013 18:09:54 env -- env/env.sh@15 -- # uname
00:06:11.013 18:09:54 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:06:11.013 18:09:54 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:06:11.013 18:09:54 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:11.013 18:09:54 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:06:11.013 18:09:54 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:11.013 18:09:54 env -- common/autotest_common.sh@10 -- # set +x
00:06:11.013 ************************************
00:06:11.013 START TEST env_dpdk_post_init
00:06:11.013 ************************************
00:06:11.013 18:09:54 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:11.013 EAL: Detected CPU lcores: 72
00:06:11.013 EAL: Detected NUMA nodes: 2
00:06:11.013 EAL: Detected shared linkage of DPDK
00:06:11.013 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:06:11.013 EAL: Selected IOVA mode 'PA'
00:06:11.013 EAL: VFIO support initialized
00:06:11.013 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0)
00:06:11.013 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym
00:06:11.013 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:11.013 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym
00:06:11.013 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:11.013 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0)
00:06:11.013 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym
00:06:11.013 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:11.013 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym
00:06:11.013 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:11.013 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0)
00:06:11.013 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0)
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0)
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0)
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0)
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0)
00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym
00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:11.014 CRYPTODEV: Initialisation 
parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 
0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, 
max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:02.6 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.014 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.014 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:11.014 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating 
cryptodev 0000:da:01.2_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:11.015 
CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:11.015 
CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:11.015 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:11.015 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:11.015 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:11.274 EAL: Using IOMMU type 1 (Type 1) 00:06:11.274 EAL: Ignore 
mapping IO port bar(1) 00:06:11.274 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:06:11.274 EAL: Ignore mapping IO port bar(1) 00:06:11.274 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:06:11.274 EAL: Ignore mapping IO port bar(1) 00:06:11.274 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:06:11.274 EAL: Ignore mapping IO port bar(1) 00:06:11.274 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:06:11.274 EAL: Ignore mapping IO port bar(1) 00:06:11.274 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:06:11.274 EAL: Ignore mapping IO port bar(1) 00:06:11.274 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:06:11.274 EAL: Ignore mapping IO port bar(1) 00:06:11.274 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:06:11.274 EAL: Ignore mapping IO port bar(1) 00:06:11.274 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:06:11.534 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 
00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Ignore mapping IO port bar(5) 00:06:11.534 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:06:11.534 EAL: Ignore mapping IO port bar(1) 00:06:11.534 EAL: Ignore mapping IO port bar(5) 00:06:11.534 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:06:14.823 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:06:14.823 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:06:14.823 Starting DPDK initialization... 00:06:14.823 Starting SPDK post initialization... 00:06:14.823 SPDK NVMe probe 00:06:14.823 Attaching to 0000:5e:00.0 00:06:14.823 Attached to 0000:5e:00.0 00:06:14.823 Cleaning up... 
00:06:14.823 00:06:14.823 real 0m3.524s 00:06:14.823 user 0m2.478s 00:06:14.823 sys 0m0.607s 00:06:14.823 18:09:58 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:14.823 18:09:58 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:14.823 ************************************ 00:06:14.823 END TEST env_dpdk_post_init 00:06:14.823 ************************************ 00:06:14.823 18:09:58 env -- common/autotest_common.sh@1142 -- # return 0 00:06:14.823 18:09:58 env -- env/env.sh@26 -- # uname 00:06:14.823 18:09:58 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:14.823 18:09:58 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:14.823 18:09:58 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.823 18:09:58 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.823 18:09:58 env -- common/autotest_common.sh@10 -- # set +x 00:06:14.823 ************************************ 00:06:14.823 START TEST env_mem_callbacks 00:06:14.823 ************************************ 00:06:14.823 18:09:58 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:14.823 EAL: Detected CPU lcores: 72 00:06:14.823 EAL: Detected NUMA nodes: 2 00:06:14.823 EAL: Detected shared linkage of DPDK 00:06:14.823 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:14.823 EAL: Selected IOVA mode 'PA' 00:06:14.823 EAL: VFIO support initialized 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 
0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:14.823 CRYPTODEV: Initialisation 
parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 
0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, 
max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.823 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:14.823 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.823 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:01.3 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating 
cryptodev 0000:3f:01.7_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:14.824 
CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:14.824 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 
00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:14.824 CRYPTODEV: Initialisation 
parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.824 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:14.824 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:14.824 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:14.825 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:14.825 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:14.825 00:06:14.825 00:06:14.825 CUnit - A unit testing framework for C - Version 2.1-3 00:06:14.825 http://cunit.sourceforge.net/ 00:06:14.825 00:06:14.825 00:06:14.825 Suite: memory 00:06:14.825 Test: test ... 00:06:14.825 register 0x200000200000 2097152 00:06:14.825 register 0x201000a00000 2097152 00:06:14.825 malloc 3145728 00:06:14.825 register 0x200000400000 4194304 00:06:14.825 buf 0x200000500000 len 3145728 PASSED 00:06:14.825 malloc 64 00:06:14.825 buf 0x2000004fff40 len 64 PASSED 00:06:14.825 malloc 4194304 00:06:14.825 register 0x200000800000 6291456 00:06:14.825 buf 0x200000a00000 len 4194304 PASSED 00:06:14.825 free 0x200000500000 3145728 00:06:14.825 free 0x2000004fff40 64 00:06:14.825 unregister 0x200000400000 4194304 PASSED 00:06:14.825 free 0x200000a00000 4194304 00:06:14.825 unregister 0x200000800000 6291456 PASSED 00:06:14.825 malloc 8388608 00:06:14.825 register 0x200000400000 10485760 00:06:14.825 buf 0x200000600000 len 8388608 PASSED 00:06:14.825 free 0x200000600000 8388608 00:06:14.825 unregister 0x200000400000 10485760 PASSED 00:06:14.825 passed 00:06:14.825 00:06:14.825 Run Summary: Type Total Ran Passed Failed Inactive 00:06:14.825 suites 1 1 n/a 0 0 00:06:14.825 tests 1 1 1 0 0 
00:06:14.825 asserts 16 16 16 0 n/a 00:06:14.825 00:06:14.825 Elapsed time = 0.006 seconds 00:06:14.825 00:06:14.825 real 0m0.107s 00:06:14.825 user 0m0.036s 00:06:14.825 sys 0m0.070s 00:06:14.825 18:09:58 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:14.825 18:09:58 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:14.825 ************************************ 00:06:14.825 END TEST env_mem_callbacks 00:06:14.825 ************************************ 00:06:14.825 18:09:58 env -- common/autotest_common.sh@1142 -- # return 0 00:06:14.825 00:06:14.825 real 0m5.806s 00:06:14.825 user 0m3.704s 00:06:14.825 sys 0m1.670s 00:06:14.825 18:09:58 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:14.825 18:09:58 env -- common/autotest_common.sh@10 -- # set +x 00:06:14.825 ************************************ 00:06:14.825 END TEST env 00:06:14.825 ************************************ 00:06:14.825 18:09:58 -- common/autotest_common.sh@1142 -- # return 0 00:06:14.825 18:09:58 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:14.825 18:09:58 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:14.825 18:09:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:14.825 18:09:58 -- common/autotest_common.sh@10 -- # set +x 00:06:14.825 ************************************ 00:06:14.825 START TEST rpc 00:06:14.825 ************************************ 00:06:14.825 18:09:58 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:14.825 * Looking for test storage... 
00:06:14.825 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:14.825 18:09:58 rpc -- rpc/rpc.sh@65 -- # spdk_pid=2416605 00:06:14.825 18:09:58 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:14.825 18:09:58 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:14.825 18:09:58 rpc -- rpc/rpc.sh@67 -- # waitforlisten 2416605 00:06:14.825 18:09:58 rpc -- common/autotest_common.sh@829 -- # '[' -z 2416605 ']' 00:06:14.825 18:09:58 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.825 18:09:58 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.825 18:09:58 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.825 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.825 18:09:58 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.825 18:09:58 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:14.825 [2024-07-12 18:09:58.535578] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:06:14.825 [2024-07-12 18:09:58.535633] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2416605 ] 00:06:15.084 [2024-07-12 18:09:58.649518] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:15.084 [2024-07-12 18:09:58.753647] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:15.085 [2024-07-12 18:09:58.753695] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2416605' to capture a snapshot of events at runtime. 
00:06:15.085 [2024-07-12 18:09:58.753710] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:15.085 [2024-07-12 18:09:58.753724] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:15.085 [2024-07-12 18:09:58.753735] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2416605 for offline analysis/debug. 00:06:15.085 [2024-07-12 18:09:58.753764] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.023 18:09:59 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:16.023 18:09:59 rpc -- common/autotest_common.sh@862 -- # return 0 00:06:16.023 18:09:59 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:16.023 18:09:59 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:16.023 18:09:59 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:16.023 18:09:59 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:16.023 18:09:59 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:16.023 18:09:59 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.023 18:09:59 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:16.023 ************************************ 00:06:16.023 START TEST rpc_integrity 00:06:16.023 ************************************ 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 
00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:16.023 { 00:06:16.023 "name": "Malloc0", 00:06:16.023 "aliases": [ 00:06:16.023 "0b95e335-31b5-4240-b90a-965bc6662760" 00:06:16.023 ], 00:06:16.023 "product_name": "Malloc disk", 00:06:16.023 "block_size": 512, 00:06:16.023 "num_blocks": 16384, 00:06:16.023 "uuid": "0b95e335-31b5-4240-b90a-965bc6662760", 00:06:16.023 "assigned_rate_limits": { 00:06:16.023 "rw_ios_per_sec": 0, 00:06:16.023 "rw_mbytes_per_sec": 0, 00:06:16.023 "r_mbytes_per_sec": 0, 00:06:16.023 "w_mbytes_per_sec": 0 00:06:16.023 }, 00:06:16.023 "claimed": false, 00:06:16.023 
"zoned": false, 00:06:16.023 "supported_io_types": { 00:06:16.023 "read": true, 00:06:16.023 "write": true, 00:06:16.023 "unmap": true, 00:06:16.023 "flush": true, 00:06:16.023 "reset": true, 00:06:16.023 "nvme_admin": false, 00:06:16.023 "nvme_io": false, 00:06:16.023 "nvme_io_md": false, 00:06:16.023 "write_zeroes": true, 00:06:16.023 "zcopy": true, 00:06:16.023 "get_zone_info": false, 00:06:16.023 "zone_management": false, 00:06:16.023 "zone_append": false, 00:06:16.023 "compare": false, 00:06:16.023 "compare_and_write": false, 00:06:16.023 "abort": true, 00:06:16.023 "seek_hole": false, 00:06:16.023 "seek_data": false, 00:06:16.023 "copy": true, 00:06:16.023 "nvme_iov_md": false 00:06:16.023 }, 00:06:16.023 "memory_domains": [ 00:06:16.023 { 00:06:16.023 "dma_device_id": "system", 00:06:16.023 "dma_device_type": 1 00:06:16.023 }, 00:06:16.023 { 00:06:16.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:16.023 "dma_device_type": 2 00:06:16.023 } 00:06:16.023 ], 00:06:16.023 "driver_specific": {} 00:06:16.023 } 00:06:16.023 ]' 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.023 [2024-07-12 18:09:59.567996] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:16.023 [2024-07-12 18:09:59.568036] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:16.023 [2024-07-12 18:09:59.568057] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c63eb0 00:06:16.023 [2024-07-12 18:09:59.568069] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:16.023 [2024-07-12 
18:09:59.569552] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:16.023 [2024-07-12 18:09:59.569580] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:16.023 Passthru0 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.023 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.023 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:16.023 { 00:06:16.023 "name": "Malloc0", 00:06:16.023 "aliases": [ 00:06:16.023 "0b95e335-31b5-4240-b90a-965bc6662760" 00:06:16.023 ], 00:06:16.023 "product_name": "Malloc disk", 00:06:16.023 "block_size": 512, 00:06:16.023 "num_blocks": 16384, 00:06:16.023 "uuid": "0b95e335-31b5-4240-b90a-965bc6662760", 00:06:16.023 "assigned_rate_limits": { 00:06:16.023 "rw_ios_per_sec": 0, 00:06:16.023 "rw_mbytes_per_sec": 0, 00:06:16.023 "r_mbytes_per_sec": 0, 00:06:16.023 "w_mbytes_per_sec": 0 00:06:16.023 }, 00:06:16.023 "claimed": true, 00:06:16.023 "claim_type": "exclusive_write", 00:06:16.023 "zoned": false, 00:06:16.023 "supported_io_types": { 00:06:16.023 "read": true, 00:06:16.023 "write": true, 00:06:16.023 "unmap": true, 00:06:16.023 "flush": true, 00:06:16.023 "reset": true, 00:06:16.023 "nvme_admin": false, 00:06:16.023 "nvme_io": false, 00:06:16.023 "nvme_io_md": false, 00:06:16.023 "write_zeroes": true, 00:06:16.023 "zcopy": true, 00:06:16.023 "get_zone_info": false, 00:06:16.023 "zone_management": false, 00:06:16.023 "zone_append": false, 00:06:16.023 "compare": false, 00:06:16.023 "compare_and_write": false, 00:06:16.024 "abort": true, 00:06:16.024 "seek_hole": false, 00:06:16.024 "seek_data": false, 
00:06:16.024 "copy": true, 00:06:16.024 "nvme_iov_md": false 00:06:16.024 }, 00:06:16.024 "memory_domains": [ 00:06:16.024 { 00:06:16.024 "dma_device_id": "system", 00:06:16.024 "dma_device_type": 1 00:06:16.024 }, 00:06:16.024 { 00:06:16.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:16.024 "dma_device_type": 2 00:06:16.024 } 00:06:16.024 ], 00:06:16.024 "driver_specific": {} 00:06:16.024 }, 00:06:16.024 { 00:06:16.024 "name": "Passthru0", 00:06:16.024 "aliases": [ 00:06:16.024 "e6c1610a-d793-5127-b3b5-f14e004f0456" 00:06:16.024 ], 00:06:16.024 "product_name": "passthru", 00:06:16.024 "block_size": 512, 00:06:16.024 "num_blocks": 16384, 00:06:16.024 "uuid": "e6c1610a-d793-5127-b3b5-f14e004f0456", 00:06:16.024 "assigned_rate_limits": { 00:06:16.024 "rw_ios_per_sec": 0, 00:06:16.024 "rw_mbytes_per_sec": 0, 00:06:16.024 "r_mbytes_per_sec": 0, 00:06:16.024 "w_mbytes_per_sec": 0 00:06:16.024 }, 00:06:16.024 "claimed": false, 00:06:16.024 "zoned": false, 00:06:16.024 "supported_io_types": { 00:06:16.024 "read": true, 00:06:16.024 "write": true, 00:06:16.024 "unmap": true, 00:06:16.024 "flush": true, 00:06:16.024 "reset": true, 00:06:16.024 "nvme_admin": false, 00:06:16.024 "nvme_io": false, 00:06:16.024 "nvme_io_md": false, 00:06:16.024 "write_zeroes": true, 00:06:16.024 "zcopy": true, 00:06:16.024 "get_zone_info": false, 00:06:16.024 "zone_management": false, 00:06:16.024 "zone_append": false, 00:06:16.024 "compare": false, 00:06:16.024 "compare_and_write": false, 00:06:16.024 "abort": true, 00:06:16.024 "seek_hole": false, 00:06:16.024 "seek_data": false, 00:06:16.024 "copy": true, 00:06:16.024 "nvme_iov_md": false 00:06:16.024 }, 00:06:16.024 "memory_domains": [ 00:06:16.024 { 00:06:16.024 "dma_device_id": "system", 00:06:16.024 "dma_device_type": 1 00:06:16.024 }, 00:06:16.024 { 00:06:16.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:16.024 "dma_device_type": 2 00:06:16.024 } 00:06:16.024 ], 00:06:16.024 "driver_specific": { 00:06:16.024 "passthru": { 
00:06:16.024 "name": "Passthru0", 00:06:16.024 "base_bdev_name": "Malloc0" 00:06:16.024 } 00:06:16.024 } 00:06:16.024 } 00:06:16.024 ]' 00:06:16.024 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:16.024 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:16.024 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.024 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.024 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.024 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:16.024 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:16.024 18:09:59 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:16.024 00:06:16.024 real 0m0.286s 00:06:16.024 user 0m0.182s 00:06:16.024 sys 0m0.042s 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:16.024 18:09:59 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.024 ************************************ 00:06:16.024 END TEST rpc_integrity 00:06:16.024 
************************************ 00:06:16.024 18:09:59 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:16.024 18:09:59 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:16.024 18:09:59 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:16.024 18:09:59 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.024 18:09:59 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:16.284 ************************************ 00:06:16.284 START TEST rpc_plugins 00:06:16.284 ************************************ 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:16.284 { 00:06:16.284 "name": "Malloc1", 00:06:16.284 "aliases": [ 00:06:16.284 "dc495e74-dadf-4781-b819-cb65b88a1dfe" 00:06:16.284 ], 00:06:16.284 "product_name": "Malloc disk", 00:06:16.284 "block_size": 4096, 00:06:16.284 "num_blocks": 256, 00:06:16.284 "uuid": "dc495e74-dadf-4781-b819-cb65b88a1dfe", 00:06:16.284 "assigned_rate_limits": { 00:06:16.284 "rw_ios_per_sec": 0, 00:06:16.284 "rw_mbytes_per_sec": 0, 00:06:16.284 "r_mbytes_per_sec": 0, 00:06:16.284 "w_mbytes_per_sec": 0 
00:06:16.284 }, 00:06:16.284 "claimed": false, 00:06:16.284 "zoned": false, 00:06:16.284 "supported_io_types": { 00:06:16.284 "read": true, 00:06:16.284 "write": true, 00:06:16.284 "unmap": true, 00:06:16.284 "flush": true, 00:06:16.284 "reset": true, 00:06:16.284 "nvme_admin": false, 00:06:16.284 "nvme_io": false, 00:06:16.284 "nvme_io_md": false, 00:06:16.284 "write_zeroes": true, 00:06:16.284 "zcopy": true, 00:06:16.284 "get_zone_info": false, 00:06:16.284 "zone_management": false, 00:06:16.284 "zone_append": false, 00:06:16.284 "compare": false, 00:06:16.284 "compare_and_write": false, 00:06:16.284 "abort": true, 00:06:16.284 "seek_hole": false, 00:06:16.284 "seek_data": false, 00:06:16.284 "copy": true, 00:06:16.284 "nvme_iov_md": false 00:06:16.284 }, 00:06:16.284 "memory_domains": [ 00:06:16.284 { 00:06:16.284 "dma_device_id": "system", 00:06:16.284 "dma_device_type": 1 00:06:16.284 }, 00:06:16.284 { 00:06:16.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:16.284 "dma_device_type": 2 00:06:16.284 } 00:06:16.284 ], 00:06:16.284 "driver_specific": {} 00:06:16.284 } 00:06:16.284 ]' 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.284 18:09:59 
rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:16.284 18:09:59 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:16.284 00:06:16.284 real 0m0.151s 00:06:16.284 user 0m0.098s 00:06:16.284 sys 0m0.018s 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:16.284 18:09:59 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:16.284 ************************************ 00:06:16.284 END TEST rpc_plugins 00:06:16.284 ************************************ 00:06:16.284 18:09:59 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:16.284 18:09:59 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:16.284 18:09:59 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:16.284 18:09:59 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.284 18:09:59 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:16.543 ************************************ 00:06:16.543 START TEST rpc_trace_cmd_test 00:06:16.543 ************************************ 00:06:16.543 18:10:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:06:16.543 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:16.543 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:16.543 18:10:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.543 18:10:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:16.543 18:10:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.543 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:16.543 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2416605", 00:06:16.543 "tpoint_group_mask": "0x8", 00:06:16.543 "iscsi_conn": { 00:06:16.543 "mask": "0x2", 00:06:16.543 "tpoint_mask": "0x0" 00:06:16.543 }, 00:06:16.543 
"scsi": { 00:06:16.543 "mask": "0x4", 00:06:16.543 "tpoint_mask": "0x0" 00:06:16.543 }, 00:06:16.543 "bdev": { 00:06:16.543 "mask": "0x8", 00:06:16.543 "tpoint_mask": "0xffffffffffffffff" 00:06:16.544 }, 00:06:16.544 "nvmf_rdma": { 00:06:16.544 "mask": "0x10", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "nvmf_tcp": { 00:06:16.544 "mask": "0x20", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "ftl": { 00:06:16.544 "mask": "0x40", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "blobfs": { 00:06:16.544 "mask": "0x80", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "dsa": { 00:06:16.544 "mask": "0x200", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "thread": { 00:06:16.544 "mask": "0x400", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "nvme_pcie": { 00:06:16.544 "mask": "0x800", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "iaa": { 00:06:16.544 "mask": "0x1000", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "nvme_tcp": { 00:06:16.544 "mask": "0x2000", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "bdev_nvme": { 00:06:16.544 "mask": "0x4000", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 }, 00:06:16.544 "sock": { 00:06:16.544 "mask": "0x8000", 00:06:16.544 "tpoint_mask": "0x0" 00:06:16.544 } 00:06:16.544 }' 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:16.544 
18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:16.544 00:06:16.544 real 0m0.243s 00:06:16.544 user 0m0.205s 00:06:16.544 sys 0m0.029s 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:16.544 18:10:00 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:16.544 ************************************ 00:06:16.544 END TEST rpc_trace_cmd_test 00:06:16.544 ************************************ 00:06:16.803 18:10:00 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:16.803 18:10:00 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:16.803 18:10:00 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:16.803 18:10:00 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:16.803 18:10:00 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:16.803 18:10:00 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:16.803 18:10:00 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:16.803 ************************************ 00:06:16.803 START TEST rpc_daemon_integrity 00:06:16.803 ************************************ 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 
00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:16.803 { 00:06:16.803 "name": "Malloc2", 00:06:16.803 "aliases": [ 00:06:16.803 "42221535-d2fe-434c-9083-25eb6cff06be" 00:06:16.803 ], 00:06:16.803 "product_name": "Malloc disk", 00:06:16.803 "block_size": 512, 00:06:16.803 "num_blocks": 16384, 00:06:16.803 "uuid": "42221535-d2fe-434c-9083-25eb6cff06be", 00:06:16.803 "assigned_rate_limits": { 00:06:16.803 "rw_ios_per_sec": 0, 00:06:16.803 "rw_mbytes_per_sec": 0, 00:06:16.803 "r_mbytes_per_sec": 0, 00:06:16.803 "w_mbytes_per_sec": 0 00:06:16.803 }, 00:06:16.803 "claimed": false, 00:06:16.803 "zoned": false, 00:06:16.803 "supported_io_types": { 00:06:16.803 "read": true, 00:06:16.803 "write": true, 00:06:16.803 "unmap": true, 00:06:16.803 "flush": true, 00:06:16.803 "reset": true, 00:06:16.803 "nvme_admin": false, 00:06:16.803 "nvme_io": false, 00:06:16.803 "nvme_io_md": false, 00:06:16.803 "write_zeroes": true, 00:06:16.803 "zcopy": true, 00:06:16.803 "get_zone_info": false, 00:06:16.803 "zone_management": 
false, 00:06:16.803 "zone_append": false, 00:06:16.803 "compare": false, 00:06:16.803 "compare_and_write": false, 00:06:16.803 "abort": true, 00:06:16.803 "seek_hole": false, 00:06:16.803 "seek_data": false, 00:06:16.803 "copy": true, 00:06:16.803 "nvme_iov_md": false 00:06:16.803 }, 00:06:16.803 "memory_domains": [ 00:06:16.803 { 00:06:16.803 "dma_device_id": "system", 00:06:16.803 "dma_device_type": 1 00:06:16.803 }, 00:06:16.803 { 00:06:16.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:16.803 "dma_device_type": 2 00:06:16.803 } 00:06:16.803 ], 00:06:16.803 "driver_specific": {} 00:06:16.803 } 00:06:16.803 ]' 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.803 [2024-07-12 18:10:00.482604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:16.803 [2024-07-12 18:10:00.482644] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:16.803 [2024-07-12 18:10:00.482670] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c64b20 00:06:16.803 [2024-07-12 18:10:00.482683] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:16.803 [2024-07-12 18:10:00.484057] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:16.803 [2024-07-12 18:10:00.484086] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:16.803 Passthru0 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:16.803 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:16.803 { 00:06:16.803 "name": "Malloc2", 00:06:16.803 "aliases": [ 00:06:16.803 "42221535-d2fe-434c-9083-25eb6cff06be" 00:06:16.803 ], 00:06:16.803 "product_name": "Malloc disk", 00:06:16.803 "block_size": 512, 00:06:16.803 "num_blocks": 16384, 00:06:16.803 "uuid": "42221535-d2fe-434c-9083-25eb6cff06be", 00:06:16.803 "assigned_rate_limits": { 00:06:16.803 "rw_ios_per_sec": 0, 00:06:16.803 "rw_mbytes_per_sec": 0, 00:06:16.803 "r_mbytes_per_sec": 0, 00:06:16.803 "w_mbytes_per_sec": 0 00:06:16.803 }, 00:06:16.803 "claimed": true, 00:06:16.803 "claim_type": "exclusive_write", 00:06:16.803 "zoned": false, 00:06:16.803 "supported_io_types": { 00:06:16.804 "read": true, 00:06:16.804 "write": true, 00:06:16.804 "unmap": true, 00:06:16.804 "flush": true, 00:06:16.804 "reset": true, 00:06:16.804 "nvme_admin": false, 00:06:16.804 "nvme_io": false, 00:06:16.804 "nvme_io_md": false, 00:06:16.804 "write_zeroes": true, 00:06:16.804 "zcopy": true, 00:06:16.804 "get_zone_info": false, 00:06:16.804 "zone_management": false, 00:06:16.804 "zone_append": false, 00:06:16.804 "compare": false, 00:06:16.804 "compare_and_write": false, 00:06:16.804 "abort": true, 00:06:16.804 "seek_hole": false, 00:06:16.804 "seek_data": false, 00:06:16.804 "copy": true, 00:06:16.804 "nvme_iov_md": false 00:06:16.804 }, 00:06:16.804 "memory_domains": [ 00:06:16.804 { 00:06:16.804 "dma_device_id": "system", 00:06:16.804 "dma_device_type": 1 00:06:16.804 }, 00:06:16.804 { 00:06:16.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:16.804 "dma_device_type": 2 00:06:16.804 } 00:06:16.804 ], 
00:06:16.804 "driver_specific": {} 00:06:16.804 }, 00:06:16.804 { 00:06:16.804 "name": "Passthru0", 00:06:16.804 "aliases": [ 00:06:16.804 "6393bb3c-31ea-5c60-9b64-2ecd90b8c070" 00:06:16.804 ], 00:06:16.804 "product_name": "passthru", 00:06:16.804 "block_size": 512, 00:06:16.804 "num_blocks": 16384, 00:06:16.804 "uuid": "6393bb3c-31ea-5c60-9b64-2ecd90b8c070", 00:06:16.804 "assigned_rate_limits": { 00:06:16.804 "rw_ios_per_sec": 0, 00:06:16.804 "rw_mbytes_per_sec": 0, 00:06:16.804 "r_mbytes_per_sec": 0, 00:06:16.804 "w_mbytes_per_sec": 0 00:06:16.804 }, 00:06:16.804 "claimed": false, 00:06:16.804 "zoned": false, 00:06:16.804 "supported_io_types": { 00:06:16.804 "read": true, 00:06:16.804 "write": true, 00:06:16.804 "unmap": true, 00:06:16.804 "flush": true, 00:06:16.804 "reset": true, 00:06:16.804 "nvme_admin": false, 00:06:16.804 "nvme_io": false, 00:06:16.804 "nvme_io_md": false, 00:06:16.804 "write_zeroes": true, 00:06:16.804 "zcopy": true, 00:06:16.804 "get_zone_info": false, 00:06:16.804 "zone_management": false, 00:06:16.804 "zone_append": false, 00:06:16.804 "compare": false, 00:06:16.804 "compare_and_write": false, 00:06:16.804 "abort": true, 00:06:16.804 "seek_hole": false, 00:06:16.804 "seek_data": false, 00:06:16.804 "copy": true, 00:06:16.804 "nvme_iov_md": false 00:06:16.804 }, 00:06:16.804 "memory_domains": [ 00:06:16.804 { 00:06:16.804 "dma_device_id": "system", 00:06:16.804 "dma_device_type": 1 00:06:16.804 }, 00:06:16.804 { 00:06:16.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:16.804 "dma_device_type": 2 00:06:16.804 } 00:06:16.804 ], 00:06:16.804 "driver_specific": { 00:06:16.804 "passthru": { 00:06:16.804 "name": "Passthru0", 00:06:16.804 "base_bdev_name": "Malloc2" 00:06:16.804 } 00:06:16.804 } 00:06:16.804 } 00:06:16.804 ]' 00:06:16.804 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:17.063 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:17.064 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:17.064 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:17.064 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:17.064 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:17.064 18:10:00 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:17.064 00:06:17.064 real 0m0.290s 00:06:17.064 user 0m0.186s 00:06:17.064 sys 0m0.047s 00:06:17.064 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:17.064 18:10:00 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:17.064 ************************************ 00:06:17.064 END TEST rpc_daemon_integrity 00:06:17.064 ************************************ 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:17.064 18:10:00 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:17.064 18:10:00 rpc -- rpc/rpc.sh@84 -- # 
killprocess 2416605 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@948 -- # '[' -z 2416605 ']' 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@952 -- # kill -0 2416605 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@953 -- # uname 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2416605 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2416605' 00:06:17.064 killing process with pid 2416605 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@967 -- # kill 2416605 00:06:17.064 18:10:00 rpc -- common/autotest_common.sh@972 -- # wait 2416605 00:06:17.631 00:06:17.631 real 0m2.718s 00:06:17.631 user 0m3.409s 00:06:17.631 sys 0m0.850s 00:06:17.631 18:10:01 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:17.631 18:10:01 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:17.631 ************************************ 00:06:17.631 END TEST rpc 00:06:17.631 ************************************ 00:06:17.631 18:10:01 -- common/autotest_common.sh@1142 -- # return 0 00:06:17.631 18:10:01 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:17.631 18:10:01 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:17.631 18:10:01 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:17.631 18:10:01 -- common/autotest_common.sh@10 -- # set +x 00:06:17.631 ************************************ 00:06:17.631 START TEST skip_rpc 00:06:17.631 ************************************ 00:06:17.632 18:10:01 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:17.632 * 
Looking for test storage... 00:06:17.632 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:17.632 18:10:01 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:17.632 18:10:01 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:17.632 18:10:01 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:17.632 18:10:01 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:17.632 18:10:01 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:17.632 18:10:01 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:17.632 ************************************ 00:06:17.632 START TEST skip_rpc 00:06:17.632 ************************************ 00:06:17.632 18:10:01 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:06:17.632 18:10:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=2417131 00:06:17.632 18:10:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:17.632 18:10:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:17.632 18:10:01 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:17.890 [2024-07-12 18:10:01.397592] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:06:17.890 [2024-07-12 18:10:01.397660] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2417131 ] 00:06:17.890 [2024-07-12 18:10:01.525795] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.149 [2024-07-12 18:10:01.624275] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 2417131 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 2417131 ']' 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 2417131 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2417131 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2417131' 00:06:23.420 killing process with pid 2417131 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 2417131 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 2417131 00:06:23.420 00:06:23.420 real 0m5.450s 00:06:23.420 user 0m5.102s 00:06:23.420 sys 0m0.364s 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:23.420 18:10:06 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:23.420 ************************************ 00:06:23.420 END TEST skip_rpc 00:06:23.420 ************************************ 00:06:23.420 18:10:06 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:23.420 18:10:06 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:23.420 18:10:06 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:23.420 18:10:06 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:23.420 18:10:06 skip_rpc -- common/autotest_common.sh@10 
-- # set +x 00:06:23.420 ************************************ 00:06:23.420 START TEST skip_rpc_with_json 00:06:23.420 ************************************ 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=2417863 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 2417863 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 2417863 ']' 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.420 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:23.420 18:10:06 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:23.420 [2024-07-12 18:10:06.930170] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:06:23.420 [2024-07-12 18:10:06.930238] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2417863 ] 00:06:23.420 [2024-07-12 18:10:07.057724] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.679 [2024-07-12 18:10:07.165990] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:24.248 [2024-07-12 18:10:07.765547] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:24.248 request: 00:06:24.248 { 00:06:24.248 "trtype": "tcp", 00:06:24.248 "method": "nvmf_get_transports", 00:06:24.248 "req_id": 1 00:06:24.248 } 00:06:24.248 Got JSON-RPC error response 00:06:24.248 response: 00:06:24.248 { 00:06:24.248 "code": -19, 00:06:24.248 "message": "No such device" 00:06:24.248 } 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:24.248 [2024-07-12 18:10:07.773672] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:24.248 18:10:07 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.248 18:10:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:24.248 { 00:06:24.248 "subsystems": [ 00:06:24.248 { 00:06:24.248 "subsystem": "keyring", 00:06:24.248 "config": [] 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "subsystem": "iobuf", 00:06:24.248 "config": [ 00:06:24.248 { 00:06:24.248 "method": "iobuf_set_options", 00:06:24.248 "params": { 00:06:24.248 "small_pool_count": 8192, 00:06:24.248 "large_pool_count": 1024, 00:06:24.248 "small_bufsize": 8192, 00:06:24.248 "large_bufsize": 135168 00:06:24.248 } 00:06:24.248 } 00:06:24.248 ] 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "subsystem": "sock", 00:06:24.248 "config": [ 00:06:24.248 { 00:06:24.248 "method": "sock_set_default_impl", 00:06:24.248 "params": { 00:06:24.248 "impl_name": "posix" 00:06:24.248 } 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "method": "sock_impl_set_options", 00:06:24.248 "params": { 00:06:24.248 "impl_name": "ssl", 00:06:24.248 "recv_buf_size": 4096, 00:06:24.248 "send_buf_size": 4096, 00:06:24.248 "enable_recv_pipe": true, 00:06:24.248 "enable_quickack": false, 00:06:24.248 "enable_placement_id": 0, 00:06:24.248 "enable_zerocopy_send_server": true, 00:06:24.248 "enable_zerocopy_send_client": false, 00:06:24.248 "zerocopy_threshold": 0, 00:06:24.248 "tls_version": 0, 00:06:24.248 "enable_ktls": false 00:06:24.248 } 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "method": "sock_impl_set_options", 00:06:24.248 "params": { 
00:06:24.248 "impl_name": "posix", 00:06:24.248 "recv_buf_size": 2097152, 00:06:24.248 "send_buf_size": 2097152, 00:06:24.248 "enable_recv_pipe": true, 00:06:24.248 "enable_quickack": false, 00:06:24.248 "enable_placement_id": 0, 00:06:24.248 "enable_zerocopy_send_server": true, 00:06:24.248 "enable_zerocopy_send_client": false, 00:06:24.248 "zerocopy_threshold": 0, 00:06:24.248 "tls_version": 0, 00:06:24.248 "enable_ktls": false 00:06:24.248 } 00:06:24.248 } 00:06:24.248 ] 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "subsystem": "vmd", 00:06:24.248 "config": [] 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "subsystem": "accel", 00:06:24.248 "config": [ 00:06:24.248 { 00:06:24.248 "method": "accel_set_options", 00:06:24.248 "params": { 00:06:24.248 "small_cache_size": 128, 00:06:24.248 "large_cache_size": 16, 00:06:24.248 "task_count": 2048, 00:06:24.248 "sequence_count": 2048, 00:06:24.248 "buf_count": 2048 00:06:24.248 } 00:06:24.248 } 00:06:24.248 ] 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "subsystem": "bdev", 00:06:24.248 "config": [ 00:06:24.248 { 00:06:24.248 "method": "bdev_set_options", 00:06:24.248 "params": { 00:06:24.248 "bdev_io_pool_size": 65535, 00:06:24.248 "bdev_io_cache_size": 256, 00:06:24.248 "bdev_auto_examine": true, 00:06:24.248 "iobuf_small_cache_size": 128, 00:06:24.248 "iobuf_large_cache_size": 16 00:06:24.248 } 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "method": "bdev_raid_set_options", 00:06:24.248 "params": { 00:06:24.248 "process_window_size_kb": 1024 00:06:24.248 } 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "method": "bdev_iscsi_set_options", 00:06:24.248 "params": { 00:06:24.248 "timeout_sec": 30 00:06:24.248 } 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "method": "bdev_nvme_set_options", 00:06:24.248 "params": { 00:06:24.248 "action_on_timeout": "none", 00:06:24.248 "timeout_us": 0, 00:06:24.248 "timeout_admin_us": 0, 00:06:24.248 "keep_alive_timeout_ms": 10000, 00:06:24.248 "arbitration_burst": 0, 00:06:24.248 
"low_priority_weight": 0, 00:06:24.248 "medium_priority_weight": 0, 00:06:24.248 "high_priority_weight": 0, 00:06:24.248 "nvme_adminq_poll_period_us": 10000, 00:06:24.248 "nvme_ioq_poll_period_us": 0, 00:06:24.248 "io_queue_requests": 0, 00:06:24.248 "delay_cmd_submit": true, 00:06:24.248 "transport_retry_count": 4, 00:06:24.248 "bdev_retry_count": 3, 00:06:24.248 "transport_ack_timeout": 0, 00:06:24.248 "ctrlr_loss_timeout_sec": 0, 00:06:24.248 "reconnect_delay_sec": 0, 00:06:24.248 "fast_io_fail_timeout_sec": 0, 00:06:24.248 "disable_auto_failback": false, 00:06:24.248 "generate_uuids": false, 00:06:24.248 "transport_tos": 0, 00:06:24.248 "nvme_error_stat": false, 00:06:24.248 "rdma_srq_size": 0, 00:06:24.248 "io_path_stat": false, 00:06:24.248 "allow_accel_sequence": false, 00:06:24.248 "rdma_max_cq_size": 0, 00:06:24.248 "rdma_cm_event_timeout_ms": 0, 00:06:24.248 "dhchap_digests": [ 00:06:24.248 "sha256", 00:06:24.248 "sha384", 00:06:24.248 "sha512" 00:06:24.248 ], 00:06:24.248 "dhchap_dhgroups": [ 00:06:24.248 "null", 00:06:24.248 "ffdhe2048", 00:06:24.248 "ffdhe3072", 00:06:24.248 "ffdhe4096", 00:06:24.248 "ffdhe6144", 00:06:24.248 "ffdhe8192" 00:06:24.248 ] 00:06:24.248 } 00:06:24.248 }, 00:06:24.248 { 00:06:24.248 "method": "bdev_nvme_set_hotplug", 00:06:24.248 "params": { 00:06:24.248 "period_us": 100000, 00:06:24.248 "enable": false 00:06:24.248 } 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "method": "bdev_wait_for_examine" 00:06:24.249 } 00:06:24.249 ] 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "subsystem": "scsi", 00:06:24.249 "config": null 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "subsystem": "scheduler", 00:06:24.249 "config": [ 00:06:24.249 { 00:06:24.249 "method": "framework_set_scheduler", 00:06:24.249 "params": { 00:06:24.249 "name": "static" 00:06:24.249 } 00:06:24.249 } 00:06:24.249 ] 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "subsystem": "vhost_scsi", 00:06:24.249 "config": [] 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "subsystem": 
"vhost_blk", 00:06:24.249 "config": [] 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "subsystem": "ublk", 00:06:24.249 "config": [] 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "subsystem": "nbd", 00:06:24.249 "config": [] 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "subsystem": "nvmf", 00:06:24.249 "config": [ 00:06:24.249 { 00:06:24.249 "method": "nvmf_set_config", 00:06:24.249 "params": { 00:06:24.249 "discovery_filter": "match_any", 00:06:24.249 "admin_cmd_passthru": { 00:06:24.249 "identify_ctrlr": false 00:06:24.249 } 00:06:24.249 } 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "method": "nvmf_set_max_subsystems", 00:06:24.249 "params": { 00:06:24.249 "max_subsystems": 1024 00:06:24.249 } 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "method": "nvmf_set_crdt", 00:06:24.249 "params": { 00:06:24.249 "crdt1": 0, 00:06:24.249 "crdt2": 0, 00:06:24.249 "crdt3": 0 00:06:24.249 } 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "method": "nvmf_create_transport", 00:06:24.249 "params": { 00:06:24.249 "trtype": "TCP", 00:06:24.249 "max_queue_depth": 128, 00:06:24.249 "max_io_qpairs_per_ctrlr": 127, 00:06:24.249 "in_capsule_data_size": 4096, 00:06:24.249 "max_io_size": 131072, 00:06:24.249 "io_unit_size": 131072, 00:06:24.249 "max_aq_depth": 128, 00:06:24.249 "num_shared_buffers": 511, 00:06:24.249 "buf_cache_size": 4294967295, 00:06:24.249 "dif_insert_or_strip": false, 00:06:24.249 "zcopy": false, 00:06:24.249 "c2h_success": true, 00:06:24.249 "sock_priority": 0, 00:06:24.249 "abort_timeout_sec": 1, 00:06:24.249 "ack_timeout": 0, 00:06:24.249 "data_wr_pool_size": 0 00:06:24.249 } 00:06:24.249 } 00:06:24.249 ] 00:06:24.249 }, 00:06:24.249 { 00:06:24.249 "subsystem": "iscsi", 00:06:24.249 "config": [ 00:06:24.249 { 00:06:24.249 "method": "iscsi_set_options", 00:06:24.249 "params": { 00:06:24.249 "node_base": "iqn.2016-06.io.spdk", 00:06:24.249 "max_sessions": 128, 00:06:24.249 "max_connections_per_session": 2, 00:06:24.249 "max_queue_depth": 64, 00:06:24.249 "default_time2wait": 2, 
00:06:24.249 "default_time2retain": 20, 00:06:24.249 "first_burst_length": 8192, 00:06:24.249 "immediate_data": true, 00:06:24.249 "allow_duplicated_isid": false, 00:06:24.249 "error_recovery_level": 0, 00:06:24.249 "nop_timeout": 60, 00:06:24.249 "nop_in_interval": 30, 00:06:24.249 "disable_chap": false, 00:06:24.249 "require_chap": false, 00:06:24.249 "mutual_chap": false, 00:06:24.249 "chap_group": 0, 00:06:24.249 "max_large_datain_per_connection": 64, 00:06:24.249 "max_r2t_per_connection": 4, 00:06:24.249 "pdu_pool_size": 36864, 00:06:24.249 "immediate_data_pool_size": 16384, 00:06:24.249 "data_out_pool_size": 2048 00:06:24.249 } 00:06:24.249 } 00:06:24.249 ] 00:06:24.249 } 00:06:24.249 ] 00:06:24.249 } 00:06:24.249 18:10:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:24.249 18:10:07 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 2417863 00:06:24.249 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2417863 ']' 00:06:24.249 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2417863 00:06:24.249 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:24.249 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:24.249 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2417863 00:06:24.508 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:24.508 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:24.508 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2417863' 00:06:24.508 killing process with pid 2417863 00:06:24.508 18:10:07 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2417863 00:06:24.508 18:10:07 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2417863 00:06:24.767 18:10:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=2418051 00:06:24.767 18:10:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:24.767 18:10:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 2418051 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2418051 ']' 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2418051 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2418051 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2418051' 00:06:30.034 killing process with pid 2418051 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2418051 00:06:30.034 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2418051 00:06:30.293 18:10:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:30.293 18:10:13 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:30.293 00:06:30.293 real 0m6.954s 00:06:30.293 user 0m6.608s 00:06:30.293 sys 0m0.781s 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:30.294 ************************************ 00:06:30.294 END TEST skip_rpc_with_json 00:06:30.294 ************************************ 00:06:30.294 18:10:13 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:30.294 18:10:13 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:30.294 18:10:13 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:30.294 18:10:13 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.294 18:10:13 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:30.294 ************************************ 00:06:30.294 START TEST skip_rpc_with_delay 00:06:30.294 ************************************ 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:30.294 18:10:13 
skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:30.294 [2024-07-12 18:10:13.961900] app.c: 831:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:06:30.294 [2024-07-12 18:10:13.962000] app.c: 710:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:30.294 00:06:30.294 real 0m0.089s 00:06:30.294 user 0m0.049s 00:06:30.294 sys 0m0.038s 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:30.294 18:10:13 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:30.294 ************************************ 00:06:30.294 END TEST skip_rpc_with_delay 00:06:30.294 ************************************ 00:06:30.294 18:10:14 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:30.294 18:10:14 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:30.294 18:10:14 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:30.294 18:10:14 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:30.294 18:10:14 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:30.294 18:10:14 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.294 18:10:14 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:30.591 ************************************ 00:06:30.591 START TEST exit_on_failed_rpc_init 00:06:30.591 ************************************ 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=2418866 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 2418866 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 2418866 ']' 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:30.591 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:30.591 18:10:14 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:30.591 [2024-07-12 18:10:14.128618] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:06:30.591 [2024-07-12 18:10:14.128684] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2418866 ] 00:06:30.591 [2024-07-12 18:10:14.259035] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.875 [2024-07-12 18:10:14.367218] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:31.444 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:31.444 [2024-07-12 18:10:15.112574] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:06:31.444 [2024-07-12 18:10:15.112641] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2418992 ] 00:06:31.703 [2024-07-12 18:10:15.230546] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.703 [2024-07-12 18:10:15.330362] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:31.703 [2024-07-12 18:10:15.330447] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:31.703 [2024-07-12 18:10:15.330463] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:31.703 [2024-07-12 18:10:15.330475] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 2418866 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 2418866 ']' 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 2418866 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2418866 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2418866' 
00:06:31.964 killing process with pid 2418866 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 2418866 00:06:31.964 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 2418866 00:06:32.224 00:06:32.224 real 0m1.782s 00:06:32.224 user 0m2.078s 00:06:32.224 sys 0m0.581s 00:06:32.224 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:32.224 18:10:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:32.224 ************************************ 00:06:32.224 END TEST exit_on_failed_rpc_init 00:06:32.224 ************************************ 00:06:32.224 18:10:15 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:32.224 18:10:15 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:32.224 00:06:32.224 real 0m14.697s 00:06:32.224 user 0m14.002s 00:06:32.224 sys 0m2.050s 00:06:32.224 18:10:15 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:32.224 18:10:15 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:32.224 ************************************ 00:06:32.224 END TEST skip_rpc 00:06:32.224 ************************************ 00:06:32.224 18:10:15 -- common/autotest_common.sh@1142 -- # return 0 00:06:32.224 18:10:15 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:32.224 18:10:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:32.224 18:10:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.224 18:10:15 -- common/autotest_common.sh@10 -- # set +x 00:06:32.483 ************************************ 00:06:32.483 START TEST rpc_client 00:06:32.483 ************************************ 00:06:32.483 18:10:15 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:32.483 * Looking for test storage... 00:06:32.483 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:32.483 18:10:16 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:32.483 OK 00:06:32.483 18:10:16 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:32.483 00:06:32.483 real 0m0.116s 00:06:32.483 user 0m0.040s 00:06:32.483 sys 0m0.083s 00:06:32.483 18:10:16 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:32.483 18:10:16 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:32.483 ************************************ 00:06:32.483 END TEST rpc_client 00:06:32.483 ************************************ 00:06:32.483 18:10:16 -- common/autotest_common.sh@1142 -- # return 0 00:06:32.483 18:10:16 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:32.483 18:10:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:32.483 18:10:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:32.483 18:10:16 -- common/autotest_common.sh@10 -- # set +x 00:06:32.483 ************************************ 00:06:32.483 START TEST json_config 00:06:32.483 ************************************ 00:06:32.483 18:10:16 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:32.743 18:10:16 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:32.743 18:10:16 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:32.743 18:10:16 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:32.743 18:10:16 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:32.743 18:10:16 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:32.743 18:10:16 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:32.743 18:10:16 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:32.743 18:10:16 json_config -- paths/export.sh@5 -- # export PATH 00:06:32.743 18:10:16 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@47 -- # : 0 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:32.743 
18:10:16 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:32.743 18:10:16 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:32.743 18:10:16 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:32.743 INFO: JSON configuration test init 00:06:32.744 18:10:16 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:32.744 18:10:16 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:32.744 18:10:16 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:32.744 18:10:16 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:32.744 18:10:16 json_config -- json_config/common.sh@9 -- # local app=target 00:06:32.744 18:10:16 json_config -- json_config/common.sh@10 -- # shift 00:06:32.744 18:10:16 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:32.744 18:10:16 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:32.744 18:10:16 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:32.744 18:10:16 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:32.744 18:10:16 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:06:32.744 18:10:16 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2419274 00:06:32.744 18:10:16 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:32.744 Waiting for target to run... 00:06:32.744 18:10:16 json_config -- json_config/common.sh@25 -- # waitforlisten 2419274 /var/tmp/spdk_tgt.sock 00:06:32.744 18:10:16 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@829 -- # '[' -z 2419274 ']' 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:32.744 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:32.744 18:10:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:32.744 [2024-07-12 18:10:16.355056] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:06:32.744 [2024-07-12 18:10:16.355132] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2419274 ] 00:06:33.312 [2024-07-12 18:10:16.924318] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.312 [2024-07-12 18:10:17.025704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.571 18:10:17 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:33.571 18:10:17 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:33.571 18:10:17 json_config -- json_config/common.sh@26 -- # echo '' 00:06:33.571 00:06:33.571 18:10:17 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:33.571 18:10:17 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:33.571 18:10:17 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:33.571 18:10:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:33.571 18:10:17 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:33.571 18:10:17 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:33.571 18:10:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:33.830 18:10:17 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:33.830 18:10:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:34.111 [2024-07-12 18:10:17.747991] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:06:34.111 18:10:17 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:34.111 18:10:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:34.369 [2024-07-12 18:10:17.916438] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:34.369 18:10:17 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:34.369 18:10:17 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:34.370 18:10:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:34.370 18:10:17 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:34.370 18:10:17 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:34.370 18:10:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:34.628 [2024-07-12 18:10:18.218024] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:37.166 18:10:20 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:37.166 18:10:20 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:37.166 18:10:20 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:37.166 18:10:20 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.166 18:10:20 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:37.166 18:10:20 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:37.166 18:10:20 json_config -- json_config/json_config.sh@46 -- # local enabled_types 
00:06:37.166 18:10:20 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:37.166 18:10:20 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:37.166 18:10:20 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:37.425 18:10:21 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:37.425 18:10:21 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:37.425 18:10:21 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:37.425 18:10:21 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:06:37.425 18:10:21 
json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:37.425 18:10:21 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:37.425 18:10:21 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:37.684 18:10:21 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:37.684 18:10:21 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:37.684 18:10:21 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:37.684 18:10:21 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:37.684 18:10:21 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:37.684 18:10:21 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:37.684 18:10:21 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:37.943 Nvme0n1p0 Nvme0n1p1 00:06:37.943 18:10:21 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:37.943 18:10:21 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:38.202 [2024-07-12 18:10:21.831053] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:38.202 [2024-07-12 18:10:21.831114] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find 
bdev with name: Malloc0 00:06:38.202 00:06:38.202 18:10:21 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:38.202 18:10:21 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:38.461 Malloc3 00:06:38.461 18:10:22 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:38.461 18:10:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:38.720 [2024-07-12 18:10:22.324457] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:38.720 [2024-07-12 18:10:22.324511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:38.720 [2024-07-12 18:10:22.324536] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2343a00 00:06:38.720 [2024-07-12 18:10:22.324549] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:38.720 [2024-07-12 18:10:22.326186] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:38.720 [2024-07-12 18:10:22.326218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:38.720 PTBdevFromMalloc3 00:06:38.720 18:10:22 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:38.720 18:10:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:38.979 Null0 00:06:38.979 18:10:22 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:38.979 18:10:22 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:06:39.239 Malloc0 00:06:39.239 18:10:22 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:06:39.239 18:10:22 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:06:39.498 Malloc1 00:06:39.498 18:10:23 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:06:39.498 18:10:23 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:06:39.757 102400+0 records in 00:06:39.757 102400+0 records out 00:06:39.757 104857600 bytes (105 MB, 100 MiB) copied, 0.299884 s, 350 MB/s 00:06:39.757 18:10:23 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:06:39.757 18:10:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:06:40.015 aio_disk 00:06:40.015 18:10:23 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:06:40.015 18:10:23 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:40.015 18:10:23 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:06:45.282 1c338c2a-81bf-478f-80f6-78d6ffef0171 
00:06:45.282 18:10:28 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:06:45.282 18:10:28 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:06:45.282 18:10:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:06:45.282 18:10:28 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:06:45.282 18:10:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:06:45.282 18:10:28 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:45.282 18:10:28 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:06:45.541 18:10:29 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:45.541 18:10:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:06:45.800 18:10:29 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:06:45.800 18:10:29 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:45.800 18:10:29 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:06:46.059 MallocForCryptoBdev 00:06:46.059 18:10:29 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:06:46.059 18:10:29 json_config -- json_config/json_config.sh@159 -- # wc -l 00:06:46.059 18:10:29 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:06:46.059 18:10:29 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:06:46.059 18:10:29 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:46.059 18:10:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:06:46.317 [2024-07-12 18:10:29.848555] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:06:46.317 CryptoMallocBdev 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:379fb8a0-22f5-4007-acd5-90dd63a5d32e bdev_register:162b3c86-c782-4a35-8c6b-8e4397eb6012 bdev_register:80fba356-5d48-4782-9cd9-d5402b19a4d7 bdev_register:85260a70-be03-4743-9a57-c9c248f8fce2 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:379fb8a0-22f5-4007-acd5-90dd63a5d32e bdev_register:162b3c86-c782-4a35-8c6b-8e4397eb6012 bdev_register:80fba356-5d48-4782-9cd9-d5402b19a4d7 bdev_register:85260a70-be03-4743-9a57-c9c248f8fce2 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@71 -- # sort 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@72 -- # sort 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.317 18:10:29 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:46.317 18:10:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:46.317 18:10:29 json_config -- 
json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- 
json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:379fb8a0-22f5-4007-acd5-90dd63a5d32e 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:162b3c86-c782-4a35-8c6b-8e4397eb6012 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:80fba356-5d48-4782-9cd9-d5402b19a4d7 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:85260a70-be03-4743-9a57-c9c248f8fce2 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:46.576 18:10:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:162b3c86-c782-4a35-8c6b-8e4397eb6012 bdev_register:379fb8a0-22f5-4007-acd5-90dd63a5d32e bdev_register:80fba356-5d48-4782-9cd9-d5402b19a4d7 bdev_register:85260a70-be03-4743-9a57-c9c248f8fce2 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\1\6\2\b\3\c\8\6\-\c\7\8\2\-\4\a\3\5\-\8\c\6\b\-\8\e\4\3\9\7\e\b\6\0\1\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\3\7\9\f\b\8\a\0\-\2\2\f\5\-\4\0\0\7\-\a\c\d\5\-\9\0\d\d\6\3\a\5\d\3\2\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\8\0\f\b\a\3\5\6\-\5\d\4\8\-\4\7\8\2\-\9\c\d\9\-\d\5\4\0\2\b\1\9\a\4\d\7\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\8\5\2\6\0\a\7\0\-\b\e\0\3\-\4\7\4\3\-\9\a\5\7\-\c\9\c\2\4\8\f\8\f\c\e\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@86 -- # cat 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:162b3c86-c782-4a35-8c6b-8e4397eb6012 bdev_register:379fb8a0-22f5-4007-acd5-90dd63a5d32e bdev_register:80fba356-5d48-4782-9cd9-d5402b19a4d7 bdev_register:85260a70-be03-4743-9a57-c9c248f8fce2 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:06:46.577 Expected events matched: 00:06:46.577 bdev_register:162b3c86-c782-4a35-8c6b-8e4397eb6012 00:06:46.577 bdev_register:379fb8a0-22f5-4007-acd5-90dd63a5d32e 00:06:46.577 
bdev_register:80fba356-5d48-4782-9cd9-d5402b19a4d7 00:06:46.577 bdev_register:85260a70-be03-4743-9a57-c9c248f8fce2 00:06:46.577 bdev_register:aio_disk 00:06:46.577 bdev_register:CryptoMallocBdev 00:06:46.577 bdev_register:Malloc0 00:06:46.577 bdev_register:Malloc0p0 00:06:46.577 bdev_register:Malloc0p1 00:06:46.577 bdev_register:Malloc0p2 00:06:46.577 bdev_register:Malloc1 00:06:46.577 bdev_register:Malloc3 00:06:46.577 bdev_register:MallocForCryptoBdev 00:06:46.577 bdev_register:Null0 00:06:46.577 bdev_register:Nvme0n1 00:06:46.577 bdev_register:Nvme0n1p0 00:06:46.577 bdev_register:Nvme0n1p1 00:06:46.577 bdev_register:PTBdevFromMalloc3 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:06:46.577 18:10:30 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:46.577 18:10:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:06:46.577 18:10:30 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:46.577 18:10:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:06:46.577 18:10:30 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:46.577 18:10:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:06:46.835 MallocBdevForConfigChangeCheck 00:06:46.835 18:10:30 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:06:46.835 18:10:30 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:46.835 18:10:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:46.835 18:10:30 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:06:46.835 18:10:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:47.403 18:10:30 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:06:47.403 INFO: shutting down applications... 00:06:47.403 18:10:30 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:06:47.403 18:10:30 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:06:47.403 18:10:30 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:06:47.403 18:10:30 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:06:47.403 [2024-07-12 18:10:31.112498] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:06:50.689 Calling clear_iscsi_subsystem 00:06:50.689 Calling clear_nvmf_subsystem 00:06:50.689 Calling clear_nbd_subsystem 00:06:50.689 Calling clear_ublk_subsystem 00:06:50.689 Calling clear_vhost_blk_subsystem 00:06:50.689 Calling clear_vhost_scsi_subsystem 00:06:50.689 Calling clear_bdev_subsystem 00:06:50.689 18:10:34 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:06:50.689 18:10:34 json_config -- json_config/json_config.sh@343 -- # count=100 00:06:50.689 18:10:34 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:06:50.689 18:10:34 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:50.689 18:10:34 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:06:50.689 18:10:34 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:06:50.948 18:10:34 json_config -- json_config/json_config.sh@345 -- # break 00:06:50.948 18:10:34 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:06:50.948 18:10:34 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:06:50.948 18:10:34 json_config -- json_config/common.sh@31 -- # local app=target 00:06:50.948 18:10:34 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:50.948 18:10:34 json_config -- json_config/common.sh@35 -- # [[ -n 2419274 ]] 00:06:50.948 18:10:34 json_config -- json_config/common.sh@38 -- # kill -SIGINT 2419274 00:06:50.948 18:10:34 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:50.948 18:10:34 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:50.948 18:10:34 json_config -- json_config/common.sh@41 -- # kill -0 2419274 00:06:50.948 18:10:34 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:06:51.517 18:10:34 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:06:51.517 18:10:34 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:51.517 18:10:34 json_config -- json_config/common.sh@41 -- # kill -0 2419274 00:06:51.517 18:10:34 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:51.517 18:10:34 json_config -- json_config/common.sh@43 -- # break 00:06:51.517 18:10:34 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:51.517 18:10:34 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:51.517 SPDK target 
shutdown done 00:06:51.517 18:10:34 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:06:51.517 INFO: relaunching applications... 00:06:51.517 18:10:34 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:51.517 18:10:34 json_config -- json_config/common.sh@9 -- # local app=target 00:06:51.517 18:10:34 json_config -- json_config/common.sh@10 -- # shift 00:06:51.517 18:10:34 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:51.517 18:10:34 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:51.517 18:10:34 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:51.517 18:10:34 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:51.517 18:10:34 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:51.517 18:10:34 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2421888 00:06:51.517 18:10:34 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:51.517 Waiting for target to run... 00:06:51.517 18:10:34 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:51.517 18:10:34 json_config -- json_config/common.sh@25 -- # waitforlisten 2421888 /var/tmp/spdk_tgt.sock 00:06:51.517 18:10:34 json_config -- common/autotest_common.sh@829 -- # '[' -z 2421888 ']' 00:06:51.517 18:10:34 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:51.517 18:10:34 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:51.517 18:10:34 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:06:51.517 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:51.517 18:10:34 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:51.517 18:10:34 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:51.517 [2024-07-12 18:10:35.030820] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:06:51.517 [2024-07-12 18:10:35.030901] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2421888 ] 00:06:52.086 [2024-07-12 18:10:35.680513] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.086 [2024-07-12 18:10:35.782611] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.359 [2024-07-12 18:10:35.836749] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:06:52.359 [2024-07-12 18:10:35.844785] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:06:52.359 [2024-07-12 18:10:35.852802] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:52.359 [2024-07-12 18:10:35.934168] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:54.957 [2024-07-12 18:10:38.149363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:54.957 [2024-07-12 18:10:38.149431] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:54.958 [2024-07-12 18:10:38.149446] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:54.958 [2024-07-12 18:10:38.157379] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
Nvme0n1 00:06:54.958 [2024-07-12 18:10:38.157409] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:06:54.958 [2024-07-12 18:10:38.165391] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:54.958 [2024-07-12 18:10:38.165418] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:54.958 [2024-07-12 18:10:38.173425] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:06:54.958 [2024-07-12 18:10:38.173454] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:06:54.958 [2024-07-12 18:10:38.173467] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:54.958 [2024-07-12 18:10:38.543422] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:54.958 [2024-07-12 18:10:38.543471] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:54.958 [2024-07-12 18:10:38.543488] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd55b90 00:06:54.958 [2024-07-12 18:10:38.543501] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:54.958 [2024-07-12 18:10:38.543805] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:54.958 [2024-07-12 18:10:38.543822] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:54.958 18:10:38 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:54.958 18:10:38 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:54.958 18:10:38 json_config -- json_config/common.sh@26 -- # echo '' 00:06:54.958 00:06:54.958 18:10:38 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:06:54.958 18:10:38 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target 
configuration is the same...' 00:06:54.958 INFO: Checking if target configuration is the same... 00:06:54.958 18:10:38 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:54.958 18:10:38 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:06:54.958 18:10:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:54.958 + '[' 2 -ne 2 ']' 00:06:54.958 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:54.958 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:06:54.958 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:54.958 +++ basename /dev/fd/62 00:06:54.958 ++ mktemp /tmp/62.XXX 00:06:54.958 + tmp_file_1=/tmp/62.PyT 00:06:54.958 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:54.958 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:54.958 + tmp_file_2=/tmp/spdk_tgt_config.json.Nhr 00:06:54.958 + ret=0 00:06:54.958 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:55.526 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:55.526 + diff -u /tmp/62.PyT /tmp/spdk_tgt_config.json.Nhr 00:06:55.526 + echo 'INFO: JSON config files are the same' 00:06:55.526 INFO: JSON config files are the same 00:06:55.526 + rm /tmp/62.PyT /tmp/spdk_tgt_config.json.Nhr 00:06:55.526 + exit 0 00:06:55.526 18:10:39 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:06:55.526 18:10:39 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 
00:06:55.526 INFO: changing configuration and checking if this can be detected... 00:06:55.526 18:10:39 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:55.526 18:10:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:06:55.784 18:10:39 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:06:55.785 18:10:39 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:55.785 18:10:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:06:55.785 + '[' 2 -ne 2 ']' 00:06:55.785 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:06:55.785 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:06:55.785 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:06:55.785 +++ basename /dev/fd/62 00:06:55.785 ++ mktemp /tmp/62.XXX 00:06:55.785 + tmp_file_1=/tmp/62.FW2 00:06:55.785 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:55.785 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:06:55.785 + tmp_file_2=/tmp/spdk_tgt_config.json.HIz 00:06:55.785 + ret=0 00:06:55.785 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:56.044 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:06:56.044 + diff -u /tmp/62.FW2 /tmp/spdk_tgt_config.json.HIz 00:06:56.044 + ret=1 00:06:56.044 + echo '=== Start of file: /tmp/62.FW2 ===' 00:06:56.044 + cat /tmp/62.FW2 00:06:56.044 + echo '=== End of file: /tmp/62.FW2 ===' 00:06:56.044 + echo '' 00:06:56.044 + echo '=== Start of file: /tmp/spdk_tgt_config.json.HIz ===' 00:06:56.044 + cat /tmp/spdk_tgt_config.json.HIz 00:06:56.044 + echo '=== End of file: /tmp/spdk_tgt_config.json.HIz ===' 00:06:56.044 + echo '' 00:06:56.044 + rm /tmp/62.FW2 /tmp/spdk_tgt_config.json.HIz 00:06:56.044 + exit 1 00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:06:56.044 INFO: configuration change detected. 
00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:06:56.044 18:10:39 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:56.044 18:10:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@317 -- # [[ -n 2421888 ]] 00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:06:56.044 18:10:39 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:56.044 18:10:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:06:56.044 18:10:39 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:06:56.044 18:10:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:06:56.302 18:10:39 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:06:56.302 18:10:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:06:56.561 18:10:40 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:06:56.561 18:10:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:06:56.820 18:10:40 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:06:56.821 18:10:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:06:57.080 18:10:40 json_config -- json_config/json_config.sh@193 -- # uname -s 00:06:57.080 18:10:40 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:06:57.080 18:10:40 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:06:57.080 18:10:40 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:06:57.080 18:10:40 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:57.080 18:10:40 json_config -- json_config/json_config.sh@323 -- # killprocess 2421888 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@948 -- # '[' -z 2421888 ']' 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@952 -- # kill -0 2421888 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@953 -- # uname 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2421888 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2421888' 00:06:57.080 killing process with pid 2421888 00:06:57.080 18:10:40 json_config -- common/autotest_common.sh@967 -- # kill 2421888 00:06:57.080 18:10:40 json_config -- 
common/autotest_common.sh@972 -- # wait 2421888 00:07:00.365 18:10:44 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:00.365 18:10:44 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:07:00.365 18:10:44 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:00.365 18:10:44 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:00.365 18:10:44 json_config -- json_config/json_config.sh@328 -- # return 0 00:07:00.365 18:10:44 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:07:00.365 INFO: Success 00:07:00.365 00:07:00.365 real 0m27.900s 00:07:00.365 user 0m33.539s 00:07:00.365 sys 0m4.149s 00:07:00.365 18:10:44 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:00.365 18:10:44 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:00.365 ************************************ 00:07:00.365 END TEST json_config 00:07:00.365 ************************************ 00:07:00.624 18:10:44 -- common/autotest_common.sh@1142 -- # return 0 00:07:00.624 18:10:44 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:00.624 18:10:44 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:00.624 18:10:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:00.624 18:10:44 -- common/autotest_common.sh@10 -- # set +x 00:07:00.624 ************************************ 00:07:00.624 START TEST json_config_extra_key 00:07:00.624 ************************************ 00:07:00.624 18:10:44 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:00.624 18:10:44 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:00.624 18:10:44 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:00.624 18:10:44 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:00.624 18:10:44 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:00.624 18:10:44 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:00.624 18:10:44 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:00.624 18:10:44 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:00.624 18:10:44 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:00.624 18:10:44 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:00.624 18:10:44 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:00.625 18:10:44 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:00.625 18:10:44 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:00.625 INFO: launching applications... 00:07:00.625 18:10:44 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=2423234 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:00.625 Waiting for target to run... 
00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 2423234 /var/tmp/spdk_tgt.sock 00:07:00.625 18:10:44 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 2423234 ']' 00:07:00.625 18:10:44 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:00.625 18:10:44 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:00.625 18:10:44 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:00.625 18:10:44 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:00.625 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:00.625 18:10:44 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:00.625 18:10:44 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:00.625 [2024-07-12 18:10:44.320450] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:00.625 [2024-07-12 18:10:44.320522] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2423234 ] 00:07:01.193 [2024-07-12 18:10:44.876901] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.451 [2024-07-12 18:10:44.980975] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.709 18:10:45 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:01.709 18:10:45 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:07:01.709 18:10:45 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:01.709 00:07:01.709 18:10:45 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:01.709 INFO: shutting down applications... 00:07:01.709 18:10:45 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:01.709 18:10:45 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:01.709 18:10:45 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:01.709 18:10:45 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 2423234 ]] 00:07:01.709 18:10:45 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 2423234 00:07:01.709 18:10:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:01.709 18:10:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:01.709 18:10:45 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2423234 00:07:01.709 18:10:45 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:02.276 18:10:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:02.276 18:10:45 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:07:02.276 18:10:45 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2423234 00:07:02.276 18:10:45 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:02.276 18:10:45 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:02.277 18:10:45 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:02.277 18:10:45 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:02.277 SPDK target shutdown done 00:07:02.277 18:10:45 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:02.277 Success 00:07:02.277 00:07:02.277 real 0m1.616s 00:07:02.277 user 0m1.108s 00:07:02.277 sys 0m0.707s 00:07:02.277 18:10:45 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:02.277 18:10:45 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:02.277 ************************************ 00:07:02.277 END TEST json_config_extra_key 00:07:02.277 ************************************ 00:07:02.277 18:10:45 -- common/autotest_common.sh@1142 -- # return 0 00:07:02.277 18:10:45 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:02.277 18:10:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:02.277 18:10:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.277 18:10:45 -- common/autotest_common.sh@10 -- # set +x 00:07:02.277 ************************************ 00:07:02.277 START TEST alias_rpc 00:07:02.277 ************************************ 00:07:02.277 18:10:45 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:02.277 * Looking for test storage... 
00:07:02.277 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:02.277 18:10:45 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:02.277 18:10:45 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2423465 00:07:02.277 18:10:45 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2423465 00:07:02.277 18:10:45 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:02.277 18:10:45 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 2423465 ']' 00:07:02.277 18:10:45 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:02.277 18:10:45 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:02.277 18:10:45 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:02.277 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:02.277 18:10:45 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:02.277 18:10:45 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:02.277 [2024-07-12 18:10:46.004691] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:02.277 [2024-07-12 18:10:46.004765] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2423465 ] 00:07:02.535 [2024-07-12 18:10:46.133595] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.535 [2024-07-12 18:10:46.234625] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.471 18:10:46 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:03.471 18:10:46 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:03.471 18:10:46 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:03.471 18:10:47 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2423465 00:07:03.471 18:10:47 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 2423465 ']' 00:07:03.471 18:10:47 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 2423465 00:07:03.471 18:10:47 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:07:03.471 18:10:47 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:03.729 18:10:47 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2423465 00:07:03.729 18:10:47 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:03.729 18:10:47 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:03.729 18:10:47 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2423465' 00:07:03.729 killing process with pid 2423465 00:07:03.729 18:10:47 alias_rpc -- common/autotest_common.sh@967 -- # kill 2423465 00:07:03.729 18:10:47 alias_rpc -- common/autotest_common.sh@972 -- # wait 2423465 00:07:03.988 00:07:03.988 real 0m1.797s 00:07:03.988 user 0m1.972s 00:07:03.988 sys 0m0.565s 00:07:03.988 18:10:47 alias_rpc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:03.988 18:10:47 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:03.988 ************************************ 00:07:03.988 END TEST alias_rpc 00:07:03.988 ************************************ 00:07:03.988 18:10:47 -- common/autotest_common.sh@1142 -- # return 0 00:07:03.988 18:10:47 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:03.988 18:10:47 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:03.988 18:10:47 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:03.988 18:10:47 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:03.988 18:10:47 -- common/autotest_common.sh@10 -- # set +x 00:07:03.988 ************************************ 00:07:03.988 START TEST spdkcli_tcp 00:07:03.988 ************************************ 00:07:03.988 18:10:47 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:04.247 * Looking for test storage... 
00:07:04.247 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:04.247 18:10:47 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:04.247 18:10:47 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2423848 00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 2423848 00:07:04.247 18:10:47 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 2423848 ']' 00:07:04.247 18:10:47 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:04.247 18:10:47 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:04.247 18:10:47 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:04.247 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:04.247 18:10:47 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:04.247 18:10:47 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:04.247 18:10:47 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:04.247 [2024-07-12 18:10:47.892362] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:04.247 [2024-07-12 18:10:47.892430] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2423848 ] 00:07:04.506 [2024-07-12 18:10:48.014918] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:04.506 [2024-07-12 18:10:48.119465] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.506 [2024-07-12 18:10:48.119471] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.443 18:10:48 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:05.443 18:10:48 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:07:05.443 18:10:48 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=2423882 00:07:05.443 18:10:48 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:05.443 18:10:48 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:05.443 [ 00:07:05.443 "bdev_malloc_delete", 00:07:05.443 "bdev_malloc_create", 00:07:05.443 "bdev_null_resize", 00:07:05.443 "bdev_null_delete", 00:07:05.443 "bdev_null_create", 00:07:05.443 "bdev_nvme_cuse_unregister", 00:07:05.443 "bdev_nvme_cuse_register", 00:07:05.443 "bdev_opal_new_user", 00:07:05.443 "bdev_opal_set_lock_state", 00:07:05.443 "bdev_opal_delete", 00:07:05.443 "bdev_opal_get_info", 00:07:05.443 
"bdev_opal_create", 00:07:05.443 "bdev_nvme_opal_revert", 00:07:05.443 "bdev_nvme_opal_init", 00:07:05.443 "bdev_nvme_send_cmd", 00:07:05.443 "bdev_nvme_get_path_iostat", 00:07:05.443 "bdev_nvme_get_mdns_discovery_info", 00:07:05.443 "bdev_nvme_stop_mdns_discovery", 00:07:05.443 "bdev_nvme_start_mdns_discovery", 00:07:05.443 "bdev_nvme_set_multipath_policy", 00:07:05.443 "bdev_nvme_set_preferred_path", 00:07:05.443 "bdev_nvme_get_io_paths", 00:07:05.443 "bdev_nvme_remove_error_injection", 00:07:05.443 "bdev_nvme_add_error_injection", 00:07:05.443 "bdev_nvme_get_discovery_info", 00:07:05.443 "bdev_nvme_stop_discovery", 00:07:05.443 "bdev_nvme_start_discovery", 00:07:05.443 "bdev_nvme_get_controller_health_info", 00:07:05.443 "bdev_nvme_disable_controller", 00:07:05.443 "bdev_nvme_enable_controller", 00:07:05.443 "bdev_nvme_reset_controller", 00:07:05.443 "bdev_nvme_get_transport_statistics", 00:07:05.443 "bdev_nvme_apply_firmware", 00:07:05.443 "bdev_nvme_detach_controller", 00:07:05.443 "bdev_nvme_get_controllers", 00:07:05.443 "bdev_nvme_attach_controller", 00:07:05.443 "bdev_nvme_set_hotplug", 00:07:05.443 "bdev_nvme_set_options", 00:07:05.443 "bdev_passthru_delete", 00:07:05.443 "bdev_passthru_create", 00:07:05.443 "bdev_lvol_set_parent_bdev", 00:07:05.443 "bdev_lvol_set_parent", 00:07:05.443 "bdev_lvol_check_shallow_copy", 00:07:05.443 "bdev_lvol_start_shallow_copy", 00:07:05.443 "bdev_lvol_grow_lvstore", 00:07:05.443 "bdev_lvol_get_lvols", 00:07:05.443 "bdev_lvol_get_lvstores", 00:07:05.443 "bdev_lvol_delete", 00:07:05.443 "bdev_lvol_set_read_only", 00:07:05.443 "bdev_lvol_resize", 00:07:05.443 "bdev_lvol_decouple_parent", 00:07:05.443 "bdev_lvol_inflate", 00:07:05.443 "bdev_lvol_rename", 00:07:05.443 "bdev_lvol_clone_bdev", 00:07:05.443 "bdev_lvol_clone", 00:07:05.443 "bdev_lvol_snapshot", 00:07:05.443 "bdev_lvol_create", 00:07:05.443 "bdev_lvol_delete_lvstore", 00:07:05.443 "bdev_lvol_rename_lvstore", 00:07:05.443 "bdev_lvol_create_lvstore", 00:07:05.443 
"bdev_raid_set_options", 00:07:05.443 "bdev_raid_remove_base_bdev", 00:07:05.443 "bdev_raid_add_base_bdev", 00:07:05.443 "bdev_raid_delete", 00:07:05.443 "bdev_raid_create", 00:07:05.443 "bdev_raid_get_bdevs", 00:07:05.443 "bdev_error_inject_error", 00:07:05.443 "bdev_error_delete", 00:07:05.443 "bdev_error_create", 00:07:05.443 "bdev_split_delete", 00:07:05.443 "bdev_split_create", 00:07:05.443 "bdev_delay_delete", 00:07:05.443 "bdev_delay_create", 00:07:05.443 "bdev_delay_update_latency", 00:07:05.443 "bdev_zone_block_delete", 00:07:05.443 "bdev_zone_block_create", 00:07:05.443 "blobfs_create", 00:07:05.443 "blobfs_detect", 00:07:05.443 "blobfs_set_cache_size", 00:07:05.443 "bdev_crypto_delete", 00:07:05.443 "bdev_crypto_create", 00:07:05.443 "bdev_compress_delete", 00:07:05.443 "bdev_compress_create", 00:07:05.443 "bdev_compress_get_orphans", 00:07:05.443 "bdev_aio_delete", 00:07:05.443 "bdev_aio_rescan", 00:07:05.443 "bdev_aio_create", 00:07:05.443 "bdev_ftl_set_property", 00:07:05.443 "bdev_ftl_get_properties", 00:07:05.443 "bdev_ftl_get_stats", 00:07:05.443 "bdev_ftl_unmap", 00:07:05.443 "bdev_ftl_unload", 00:07:05.443 "bdev_ftl_delete", 00:07:05.443 "bdev_ftl_load", 00:07:05.443 "bdev_ftl_create", 00:07:05.443 "bdev_virtio_attach_controller", 00:07:05.443 "bdev_virtio_scsi_get_devices", 00:07:05.443 "bdev_virtio_detach_controller", 00:07:05.443 "bdev_virtio_blk_set_hotplug", 00:07:05.443 "bdev_iscsi_delete", 00:07:05.443 "bdev_iscsi_create", 00:07:05.443 "bdev_iscsi_set_options", 00:07:05.443 "accel_error_inject_error", 00:07:05.443 "ioat_scan_accel_module", 00:07:05.443 "dsa_scan_accel_module", 00:07:05.443 "iaa_scan_accel_module", 00:07:05.443 "dpdk_cryptodev_get_driver", 00:07:05.443 "dpdk_cryptodev_set_driver", 00:07:05.443 "dpdk_cryptodev_scan_accel_module", 00:07:05.443 "compressdev_scan_accel_module", 00:07:05.443 "keyring_file_remove_key", 00:07:05.443 "keyring_file_add_key", 00:07:05.443 "keyring_linux_set_options", 00:07:05.443 
"iscsi_get_histogram", 00:07:05.443 "iscsi_enable_histogram", 00:07:05.443 "iscsi_set_options", 00:07:05.443 "iscsi_get_auth_groups", 00:07:05.443 "iscsi_auth_group_remove_secret", 00:07:05.443 "iscsi_auth_group_add_secret", 00:07:05.443 "iscsi_delete_auth_group", 00:07:05.443 "iscsi_create_auth_group", 00:07:05.443 "iscsi_set_discovery_auth", 00:07:05.443 "iscsi_get_options", 00:07:05.443 "iscsi_target_node_request_logout", 00:07:05.444 "iscsi_target_node_set_redirect", 00:07:05.444 "iscsi_target_node_set_auth", 00:07:05.444 "iscsi_target_node_add_lun", 00:07:05.444 "iscsi_get_stats", 00:07:05.444 "iscsi_get_connections", 00:07:05.444 "iscsi_portal_group_set_auth", 00:07:05.444 "iscsi_start_portal_group", 00:07:05.444 "iscsi_delete_portal_group", 00:07:05.444 "iscsi_create_portal_group", 00:07:05.444 "iscsi_get_portal_groups", 00:07:05.444 "iscsi_delete_target_node", 00:07:05.444 "iscsi_target_node_remove_pg_ig_maps", 00:07:05.444 "iscsi_target_node_add_pg_ig_maps", 00:07:05.444 "iscsi_create_target_node", 00:07:05.444 "iscsi_get_target_nodes", 00:07:05.444 "iscsi_delete_initiator_group", 00:07:05.444 "iscsi_initiator_group_remove_initiators", 00:07:05.444 "iscsi_initiator_group_add_initiators", 00:07:05.444 "iscsi_create_initiator_group", 00:07:05.444 "iscsi_get_initiator_groups", 00:07:05.444 "nvmf_set_crdt", 00:07:05.444 "nvmf_set_config", 00:07:05.444 "nvmf_set_max_subsystems", 00:07:05.444 "nvmf_stop_mdns_prr", 00:07:05.444 "nvmf_publish_mdns_prr", 00:07:05.444 "nvmf_subsystem_get_listeners", 00:07:05.444 "nvmf_subsystem_get_qpairs", 00:07:05.444 "nvmf_subsystem_get_controllers", 00:07:05.444 "nvmf_get_stats", 00:07:05.444 "nvmf_get_transports", 00:07:05.444 "nvmf_create_transport", 00:07:05.444 "nvmf_get_targets", 00:07:05.444 "nvmf_delete_target", 00:07:05.444 "nvmf_create_target", 00:07:05.444 "nvmf_subsystem_allow_any_host", 00:07:05.444 "nvmf_subsystem_remove_host", 00:07:05.444 "nvmf_subsystem_add_host", 00:07:05.444 "nvmf_ns_remove_host", 00:07:05.444 
"nvmf_ns_add_host", 00:07:05.444 "nvmf_subsystem_remove_ns", 00:07:05.444 "nvmf_subsystem_add_ns", 00:07:05.444 "nvmf_subsystem_listener_set_ana_state", 00:07:05.444 "nvmf_discovery_get_referrals", 00:07:05.444 "nvmf_discovery_remove_referral", 00:07:05.444 "nvmf_discovery_add_referral", 00:07:05.444 "nvmf_subsystem_remove_listener", 00:07:05.444 "nvmf_subsystem_add_listener", 00:07:05.444 "nvmf_delete_subsystem", 00:07:05.444 "nvmf_create_subsystem", 00:07:05.444 "nvmf_get_subsystems", 00:07:05.444 "env_dpdk_get_mem_stats", 00:07:05.444 "nbd_get_disks", 00:07:05.444 "nbd_stop_disk", 00:07:05.444 "nbd_start_disk", 00:07:05.444 "ublk_recover_disk", 00:07:05.444 "ublk_get_disks", 00:07:05.444 "ublk_stop_disk", 00:07:05.444 "ublk_start_disk", 00:07:05.444 "ublk_destroy_target", 00:07:05.444 "ublk_create_target", 00:07:05.444 "virtio_blk_create_transport", 00:07:05.444 "virtio_blk_get_transports", 00:07:05.444 "vhost_controller_set_coalescing", 00:07:05.444 "vhost_get_controllers", 00:07:05.444 "vhost_delete_controller", 00:07:05.444 "vhost_create_blk_controller", 00:07:05.444 "vhost_scsi_controller_remove_target", 00:07:05.444 "vhost_scsi_controller_add_target", 00:07:05.444 "vhost_start_scsi_controller", 00:07:05.444 "vhost_create_scsi_controller", 00:07:05.444 "thread_set_cpumask", 00:07:05.444 "framework_get_governor", 00:07:05.444 "framework_get_scheduler", 00:07:05.444 "framework_set_scheduler", 00:07:05.444 "framework_get_reactors", 00:07:05.444 "thread_get_io_channels", 00:07:05.444 "thread_get_pollers", 00:07:05.444 "thread_get_stats", 00:07:05.444 "framework_monitor_context_switch", 00:07:05.444 "spdk_kill_instance", 00:07:05.444 "log_enable_timestamps", 00:07:05.444 "log_get_flags", 00:07:05.444 "log_clear_flag", 00:07:05.444 "log_set_flag", 00:07:05.444 "log_get_level", 00:07:05.444 "log_set_level", 00:07:05.444 "log_get_print_level", 00:07:05.444 "log_set_print_level", 00:07:05.444 "framework_enable_cpumask_locks", 00:07:05.444 
"framework_disable_cpumask_locks", 00:07:05.444 "framework_wait_init", 00:07:05.444 "framework_start_init", 00:07:05.444 "scsi_get_devices", 00:07:05.444 "bdev_get_histogram", 00:07:05.444 "bdev_enable_histogram", 00:07:05.444 "bdev_set_qos_limit", 00:07:05.444 "bdev_set_qd_sampling_period", 00:07:05.444 "bdev_get_bdevs", 00:07:05.444 "bdev_reset_iostat", 00:07:05.444 "bdev_get_iostat", 00:07:05.444 "bdev_examine", 00:07:05.444 "bdev_wait_for_examine", 00:07:05.444 "bdev_set_options", 00:07:05.444 "notify_get_notifications", 00:07:05.444 "notify_get_types", 00:07:05.444 "accel_get_stats", 00:07:05.444 "accel_set_options", 00:07:05.444 "accel_set_driver", 00:07:05.444 "accel_crypto_key_destroy", 00:07:05.444 "accel_crypto_keys_get", 00:07:05.444 "accel_crypto_key_create", 00:07:05.444 "accel_assign_opc", 00:07:05.444 "accel_get_module_info", 00:07:05.444 "accel_get_opc_assignments", 00:07:05.444 "vmd_rescan", 00:07:05.444 "vmd_remove_device", 00:07:05.444 "vmd_enable", 00:07:05.444 "sock_get_default_impl", 00:07:05.444 "sock_set_default_impl", 00:07:05.444 "sock_impl_set_options", 00:07:05.444 "sock_impl_get_options", 00:07:05.444 "iobuf_get_stats", 00:07:05.444 "iobuf_set_options", 00:07:05.444 "framework_get_pci_devices", 00:07:05.444 "framework_get_config", 00:07:05.444 "framework_get_subsystems", 00:07:05.444 "trace_get_info", 00:07:05.444 "trace_get_tpoint_group_mask", 00:07:05.444 "trace_disable_tpoint_group", 00:07:05.444 "trace_enable_tpoint_group", 00:07:05.444 "trace_clear_tpoint_mask", 00:07:05.444 "trace_set_tpoint_mask", 00:07:05.444 "keyring_get_keys", 00:07:05.444 "spdk_get_version", 00:07:05.444 "rpc_get_methods" 00:07:05.444 ] 00:07:05.444 18:10:49 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:05.444 18:10:49 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT 
SIGTERM EXIT 00:07:05.444 18:10:49 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 2423848 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 2423848 ']' 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 2423848 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2423848 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2423848' 00:07:05.444 killing process with pid 2423848 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 2423848 00:07:05.444 18:10:49 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 2423848 00:07:06.013 00:07:06.013 real 0m1.838s 00:07:06.013 user 0m3.355s 00:07:06.013 sys 0m0.590s 00:07:06.013 18:10:49 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:06.013 18:10:49 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:06.013 ************************************ 00:07:06.013 END TEST spdkcli_tcp 00:07:06.013 ************************************ 00:07:06.013 18:10:49 -- common/autotest_common.sh@1142 -- # return 0 00:07:06.013 18:10:49 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:06.013 18:10:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:06.013 18:10:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:06.013 18:10:49 -- common/autotest_common.sh@10 -- # set +x 00:07:06.013 ************************************ 00:07:06.013 START TEST 
dpdk_mem_utility 00:07:06.013 ************************************ 00:07:06.013 18:10:49 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:06.013 * Looking for test storage... 00:07:06.013 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:06.013 18:10:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:06.013 18:10:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2424119 00:07:06.013 18:10:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:06.013 18:10:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2424119 00:07:06.013 18:10:49 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 2424119 ']' 00:07:06.013 18:10:49 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:06.013 18:10:49 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:06.013 18:10:49 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:06.013 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:06.013 18:10:49 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:06.013 18:10:49 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:06.272 [2024-07-12 18:10:49.786870] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:06.272 [2024-07-12 18:10:49.786950] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2424119 ] 00:07:06.272 [2024-07-12 18:10:49.905915] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.530 [2024-07-12 18:10:50.013936] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.097 18:10:50 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:07.097 18:10:50 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:07:07.097 18:10:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:07.097 18:10:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:07.097 18:10:50 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:07.097 18:10:50 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:07.097 { 00:07:07.097 "filename": "/tmp/spdk_mem_dump.txt" 00:07:07.097 } 00:07:07.097 18:10:50 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:07.097 18:10:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:07.097 DPDK memory size 816.000000 MiB in 2 heap(s) 00:07:07.097 2 heaps totaling size 816.000000 MiB 00:07:07.097 size: 814.000000 MiB heap id: 0 00:07:07.097 size: 2.000000 MiB heap id: 1 00:07:07.097 end heaps---------- 00:07:07.097 8 mempools totaling size 598.116089 MiB 00:07:07.097 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:07.097 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:07.097 size: 84.521057 MiB name: bdev_io_2424119 00:07:07.097 size: 51.011292 MiB name: evtpool_2424119 00:07:07.097 size: 
50.003479 MiB name: msgpool_2424119 00:07:07.097 size: 21.763794 MiB name: PDU_Pool 00:07:07.097 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:07.097 size: 0.026123 MiB name: Session_Pool 00:07:07.097 end mempools------- 00:07:07.097 201 memzones totaling size 4.176453 MiB 00:07:07.097 size: 1.000366 MiB name: RG_ring_0_2424119 00:07:07.097 size: 1.000366 MiB name: RG_ring_1_2424119 00:07:07.097 size: 1.000366 MiB name: RG_ring_4_2424119 00:07:07.097 size: 1.000366 MiB name: RG_ring_5_2424119 00:07:07.097 size: 0.125366 MiB name: RG_ring_2_2424119 00:07:07.097 size: 0.015991 MiB name: RG_ring_3_2424119 00:07:07.097 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:07.097 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:01.7_qat 
00:07:07.097 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:07.097 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:07.097 size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:07.097 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:07.097 size: 
0.000122 MiB name: rte_cryptodev_data_7 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:07.097 size: 
0.000122 MiB name: rte_cryptodev_data_29 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:07.097 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:07.097 size: 0.000122 MiB name: rte_compressdev_data_20 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:07.098 
size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_73 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_80 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:07.098 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:07:07.098 size: 0.000122 MiB name: rte_cryptodev_data_95 00:07:07.098 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:07.098 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:07.098 end memzones------- 00:07:07.098 18:10:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:07.360 heap id: 0 total size: 814.000000 MiB number of busy elements: 526 number of free elements: 14 00:07:07.360 list of free elements. size: 11.813538 MiB 00:07:07.360 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:07.360 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:07.360 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:07.360 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:07.360 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:07.360 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:07.360 element at address: 0x200007000000 with size: 0.959839 MiB 00:07:07.360 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:07.360 element at address: 0x20001aa00000 with size: 0.582336 MiB 00:07:07.360 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:07.360 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:07.360 element at address: 0x200000800000 with size: 0.486877 MiB 00:07:07.360 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:07.360 element at address: 0x200027e00000 with size: 0.402527 MiB 00:07:07.360 list of standard malloc elements. 
size: 199.878174 MiB 00:07:07.360 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:07.360 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:07.360 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:07.360 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:07.360 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:07.360 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:07.360 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:07.360 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:07.360 element at address: 0x200000330b40 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000337640 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000033e140 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000344c40 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000034b740 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000352240 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000358d40 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000035f840 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000366880 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000036a340 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000036de00 with size: 0.004395 MiB 00:07:07.360 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000375380 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000378e40 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000037c900 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000383e80 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000387940 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000038b400 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000392980 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000396440 with size: 0.004395 MiB 00:07:07.360 element at address: 0x200000399f00 with size: 0.004395 MiB 00:07:07.360 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:07:07.360 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:07:07.361 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:07.361 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000333040 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000335540 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000339b40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000033c040 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000340640 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000342b40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000347140 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000349640 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000350140 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000354740 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000356c40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:07:07.361 element at address: 0x20000035b240 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000035d740 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000361d40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000364780 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000365800 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000368240 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000370840 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000373280 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000374300 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000376d40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000037a800 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000037b880 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000037f340 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000381d80 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000382e00 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000385840 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000389300 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000038a380 with size: 0.004028 MiB 00:07:07.361 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000038de40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000390880 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000391900 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000394340 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000397e00 with size: 0.004028 MiB 00:07:07.361 element at address: 0x200000398e80 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000039c940 with size: 0.004028 MiB 00:07:07.361 element at address: 0x20000039f380 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:07:07.361 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:07.361 element at address: 0x2000002048c0 with size: 0.000305 MiB 00:07:07.361 element at address: 0x200000200000 with size: 0.000183 MiB 00:07:07.361 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200180 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200240 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200300 with size: 0.000183 MiB 00:07:07.361 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200480 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200540 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200600 with size: 0.000183 MiB 00:07:07.361 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200780 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200840 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200900 with size: 0.000183 
MiB 00:07:07.361 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200a80 with size: 0.000183 MiB 00:07:07.361 element at address: 0x200000200b40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000200c00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000200d80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000200e40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000200f00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201080 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201140 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201200 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201380 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201440 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201500 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201680 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201740 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201800 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201980 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201a40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201b00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201c80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201d40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201e00 
with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000201f80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202040 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202100 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202280 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202340 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202400 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202580 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202640 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202700 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202880 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202940 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202a00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202b80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202c40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202d00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202e80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000202f40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203000 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203180 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203240 with size: 0.000183 MiB 00:07:07.362 element at 
address: 0x200000203300 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203480 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203540 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203600 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203780 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203840 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203900 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203a80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203b40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203c00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203d80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203e40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203f00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204080 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204140 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204200 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204380 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204440 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204500 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002045c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204680 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204740 with size: 0.000183 MiB 
00:07:07.362 element at address: 0x200000204800 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204a00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204ac0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204b80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204c40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204d00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204dc0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204e80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000204f40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205000 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205180 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205240 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205300 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205480 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205540 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205600 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205780 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205840 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205900 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205a80 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205b40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205c00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205d80 with 
size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205e40 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205f00 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000206080 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000206140 with size: 0.000183 MiB 00:07:07.362 element at address: 0x200000206200 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:07:07.362 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000020a780 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022af80 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b040 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b100 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b280 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b340 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b400 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b580 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b640 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b700 with size: 0.000183 MiB 00:07:07.363 element at address: 
0x20000022b7c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022be40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022c080 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022c140 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022c200 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022c380 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022c440 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000022c500 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000032e700 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000331d40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000338840 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000033f340 with size: 0.000183 MiB 00:07:07.363 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000345e40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000034c940 with size: 0.000183 MiB 00:07:07.363 
element at address: 0x20000034fec0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000353440 with size: 0.000183 MiB 00:07:07.363 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000359f40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000360a40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000364180 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000364240 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000364400 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000367a80 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000367c40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000367d00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000036b540 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000036b700 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000036b980 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000036f000 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000036f280 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000036f440 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000372c80 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000372d40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000372f00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000376580 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000376740 with size: 0.000183 
MiB 00:07:07.363 element at address: 0x200000376800 with size: 0.000183 MiB 00:07:07.363 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000037a040 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000037a200 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000037a480 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000037db00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000037df40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000381780 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000381840 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000381a00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000385080 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000385240 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000385300 with size: 0.000183 MiB 00:07:07.363 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000388b40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000388d00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000388f80 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000038c600 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000038c880 with size: 0.000183 MiB 00:07:07.363 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000390280 
with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000390340 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000390500 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000393b80 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000393d40 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000393e00 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:07:07.363 element at address: 0x200000397640 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200000397800 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200000397a80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000039b100 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000039b380 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000039b540 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000039f000 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:07:07.364 element at 
address: 0x2000003a9dc0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003c3740 with size: 0.000183 MiB 
00:07:07.364 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000087cd40 with 
size: 0.000183 MiB 00:07:07.364 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:07.364 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:07:07.364 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e670c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e67180 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6dd80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:07:07.364 element at address: 
0x200027e6eac0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:07:07.364 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:07:07.365 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:07:07.365 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:07:07.365 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:07:07.365 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:07:07.365 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:07:07.365 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:07.365 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:07.365 
list of memzone associated elements. size: 602.308289 MiB 00:07:07.365 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:07.365 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:07.365 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:07.365 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:07.365 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:07.365 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2424119_0 00:07:07.365 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:07.365 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2424119_0 00:07:07.365 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:07.365 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2424119_0 00:07:07.365 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:07.365 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:07.365 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:07.365 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:07.365 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:07.365 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2424119 00:07:07.365 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:07.365 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2424119 00:07:07.365 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:07:07.365 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2424119 00:07:07.365 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:07.365 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:07.365 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:07:07.365 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:07.365 element at address: 0x2000070fde40 with 
size: 1.008118 MiB 00:07:07.365 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:07.365 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:07.365 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:07.365 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:07.365 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2424119 00:07:07.365 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:07.365 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2424119 00:07:07.365 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:07.365 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2424119 00:07:07.365 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:07.365 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2424119 00:07:07.365 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:07:07.365 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2424119 00:07:07.365 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:07:07.365 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:07.365 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:07.365 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:07.365 element at address: 0x20001947c600 with size: 0.250488 MiB 00:07:07.365 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:07.365 element at address: 0x20000020a840 with size: 0.125488 MiB 00:07:07.365 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2424119 00:07:07.365 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:07:07.365 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:07.365 element at address: 0x200027e67240 with size: 0.023743 MiB 00:07:07.365 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:07.365 element at address: 
0x200000206580 with size: 0.016113 MiB 00:07:07.365 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2424119 00:07:07.365 element at address: 0x200027e6d380 with size: 0.002441 MiB 00:07:07.365 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:07.365 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:07:07.365 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:07.365 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:07:07.365 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:07:07.365 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:07:07.365 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:07:07.365 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:07:07.365 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:07:07.365 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:07:07.365 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:07:07.365 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:07:07.365 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:07:07.365 element at address: 0x2000003b1780 
with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:07:07.365 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:07:07.365 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:07:07.365 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:07:07.365 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:07:07.365 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:07:07.365 element at address: 0x20000039b700 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:07:07.365 element at address: 0x200000397c40 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat 00:07:07.365 element at address: 0x200000394180 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:07:07.365 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:07:07.365 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:07:07.365 element at address: 0x200000389140 with size: 0.000427 MiB 00:07:07.365 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:07:07.366 element at address: 0x200000385680 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:07:07.366 element at address: 0x200000381bc0 with size: 0.000427 MiB 
00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.7_qat 00:07:07.366 element at address: 0x20000037e100 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:07.366 element at address: 0x20000037a640 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:07.366 element at address: 0x200000376b80 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:07.366 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:07.366 element at address: 0x20000036f600 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:07.366 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:07.366 element at address: 0x200000368080 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:07.366 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:07.366 element at address: 0x200000360b00 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:07.366 element at address: 0x20000035d580 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:07.366 element at address: 0x20000035a000 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:07.366 element at address: 0x200000356a80 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:07.366 element at address: 0x200000353500 with size: 0.000427 MiB 00:07:07.366 associated 
memzone info: size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:07.366 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:07.366 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:07.366 element at address: 0x200000349480 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:07.366 element at address: 0x200000345f00 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:07.366 element at address: 0x200000342980 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:07.366 element at address: 0x20000033f400 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:07.366 element at address: 0x20000033be80 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:07.366 element at address: 0x200000338900 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:07.366 element at address: 0x200000335380 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:07.366 element at address: 0x200000331e00 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:07.366 element at address: 0x20000032e880 with size: 0.000427 MiB 00:07:07.366 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:07.366 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:07:07.366 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:07.366 element at address: 0x20000022b880 with size: 0.000305 MiB 00:07:07.366 associated memzone info: size: 
0.000183 MiB name: MP_msgpool_2424119 00:07:07.366 element at address: 0x200000206380 with size: 0.000305 MiB 00:07:07.366 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2424119 00:07:07.366 element at address: 0x200027e6de40 with size: 0.000305 MiB 00:07:07.366 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:07.366 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:07:07.366 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:07.366 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:07:07.366 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:07.366 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:07:07.366 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:07:07.366 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:07:07.366 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:07.366 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:07:07.366 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:07.366 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:07:07.366 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:07:07.366 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:07:07.366 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:07.366 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:07:07.366 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:07.366 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:07:07.366 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:07:07.367 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:07.367 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:07:07.367 
associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:07.367 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:07:07.367 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:07.367 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:07.367 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:07:07.367 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:07.367 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:07.367 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:07:07.367 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:07.367 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:07.367 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:07:07.367 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:07.367 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:07.367 element at address: 
0x2000003bc280 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:07:07.367 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:07.367 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:07.367 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:07:07.367 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:07.367 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:07.367 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:07:07.367 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:07.367 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:07.367 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:07:07.367 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:07.367 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:07.367 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: 
rte_compressdev_data_11 00:07:07.367 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:07.367 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:07.367 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:07:07.367 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:07.367 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:07.367 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:07:07.367 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:07.367 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:07.367 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:07:07.367 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:07.367 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:07.367 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:07:07.367 element at address: 0x20000039b600 with size: 0.000244 MiB 
00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:07.367 element at address: 0x20000039b440 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:07.367 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:07:07.367 element at address: 0x200000397b40 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:07.367 element at address: 0x200000397980 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:07.367 element at address: 0x200000397700 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:07:07.367 element at address: 0x200000394080 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:07.367 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:07.367 element at address: 0x200000393c40 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:07:07.367 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:07.367 element at address: 0x200000390400 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:07.367 element at address: 0x200000390180 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:07:07.367 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:07.367 element 
at address: 0x20000038c940 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:07.367 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:07:07.367 element at address: 0x200000389040 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:07.367 element at address: 0x200000388e80 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:07.367 element at address: 0x200000388c00 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:07:07.367 element at address: 0x200000385580 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:07.367 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:07.367 element at address: 0x200000385140 with size: 0.000244 MiB 00:07:07.367 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:07:07.367 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:07.368 element at address: 0x200000381900 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:07.368 element at address: 0x200000381680 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:07:07.368 element at address: 0x20000037e000 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:07.368 element at address: 0x20000037de40 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB 
name: rte_cryptodev_data_49 00:07:07.368 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:07:07.368 element at address: 0x20000037a540 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:07.368 element at address: 0x20000037a380 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:07.368 element at address: 0x20000037a100 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:07:07.368 element at address: 0x200000376a80 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:07.368 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:07.368 element at address: 0x200000376640 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:07:07.368 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:07.368 element at address: 0x200000372e00 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:07.368 element at address: 0x200000372b80 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:07:07.368 element at address: 0x20000036f500 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:07.368 element at address: 0x20000036f340 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:07.368 element at address: 0x20000036f0c0 with size: 0.000244 MiB 
00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:07:07.368 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:07.368 element at address: 0x20000036b880 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:07.368 element at address: 0x20000036b600 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:07:07.368 element at address: 0x200000367f80 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:07.368 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:07.368 element at address: 0x200000367b40 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:07:07.368 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:07.368 element at address: 0x200000364300 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:07.368 element at address: 0x200000364080 with size: 0.000244 MiB 00:07:07.368 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:07:07.368 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:07:07.368 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:07.368 18:10:51 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:07.368 18:10:51 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2424119 00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 2424119 ']' 
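The memzone dump above enumerates one `rte_cryptodev_data_N` region per crypto queue pair and one `rte_compressdev_data_N` region per compressdev instance. A minimal sketch for tallying those families from a saved copy of such a dump (the three sample records and the `/tmp` path are illustrative, not part of the test run):

```shell
# Tally memzone names by device family from a saved dpdk_mem_utility dump.
# Assumes each record carries a trailing "name: <memzone>" field, as above.
cat > /tmp/memzone_sample.txt <<'EOF'
associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0
associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0
associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1
EOF
awk '/name: rte_(crypto|compress)dev_data/ {
  split($NF, f, "_data_")   # f[1] = family prefix, f[2] = index
  count[f[1]]++
}
END { for (k in count) printf "%s %d\n", k, count[k] }' /tmp/memzone_sample.txt | sort
```

Run against the full dump above, the same program would report 64 `rte_cryptodev` and 32 `rte_compressdev` data regions (indices 0-63 and 0-31 respectively).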
00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 2424119 00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2424119 00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2424119' 00:07:07.368 killing process with pid 2424119 00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 2424119 00:07:07.368 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 2424119 00:07:07.935 00:07:07.935 real 0m1.826s 00:07:07.935 user 0m2.054s 00:07:07.936 sys 0m0.564s 00:07:07.936 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:07.936 18:10:51 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:07.936 ************************************ 00:07:07.936 END TEST dpdk_mem_utility 00:07:07.936 ************************************ 00:07:07.936 18:10:51 -- common/autotest_common.sh@1142 -- # return 0 00:07:07.936 18:10:51 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:07.936 18:10:51 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:07.936 18:10:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.936 18:10:51 -- common/autotest_common.sh@10 -- # set +x 00:07:07.936 ************************************ 00:07:07.936 START TEST event 00:07:07.936 ************************************ 00:07:07.936 18:10:51 event -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:07.936 * Looking for test storage... 00:07:07.936 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:07:07.936 18:10:51 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:07.936 18:10:51 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:07.936 18:10:51 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:07.936 18:10:51 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:07.936 18:10:51 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:07.936 18:10:51 event -- common/autotest_common.sh@10 -- # set +x 00:07:08.194 ************************************ 00:07:08.194 START TEST event_perf 00:07:08.194 ************************************ 00:07:08.194 18:10:51 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:08.194 Running I/O for 1 seconds...[2024-07-12 18:10:51.701335] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:08.194 [2024-07-12 18:10:51.701401] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2424401 ] 00:07:08.194 [2024-07-12 18:10:51.831770] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:08.452 [2024-07-12 18:10:51.937522] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:08.452 [2024-07-12 18:10:51.937607] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:08.452 [2024-07-12 18:10:51.937681] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:08.452 [2024-07-12 18:10:51.937685] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.386 Running I/O for 1 seconds... 00:07:09.386 lcore 0: 178900 00:07:09.386 lcore 1: 178899 00:07:09.386 lcore 2: 178900 00:07:09.386 lcore 3: 178902 00:07:09.386 done. 
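event_perf was launched with `-m 0xF`, and the four reactors reported on cores 0-3 follow directly from the set bits of that mask. A small sketch decoding such a hex core mask into an lcore list (plain POSIX shell, not part of the SPDK tooling):

```shell
# Expand a DPDK/SPDK-style hex core mask into the lcore list it selects.
mask=0xF            # same mask as the event_perf run above
cores=""
bit=0
while [ "$bit" -lt 32 ]; do
  if [ $(( (mask >> bit) & 1 )) -eq 1 ]; then
    cores="$cores $bit"
  fi
  bit=$((bit + 1))
done
echo "lcores:$cores"
```

For `-m 0xF` this prints `lcores: 0 1 2 3`, matching the four reactors started in the run above.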
00:07:09.386 00:07:09.386 real 0m1.364s 00:07:09.386 user 0m4.214s 00:07:09.386 sys 0m0.144s 00:07:09.386 18:10:53 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:09.386 18:10:53 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:09.386 ************************************ 00:07:09.386 END TEST event_perf 00:07:09.386 ************************************ 00:07:09.386 18:10:53 event -- common/autotest_common.sh@1142 -- # return 0 00:07:09.386 18:10:53 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:09.386 18:10:53 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:09.386 18:10:53 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.386 18:10:53 event -- common/autotest_common.sh@10 -- # set +x 00:07:09.644 ************************************ 00:07:09.644 START TEST event_reactor 00:07:09.644 ************************************ 00:07:09.644 18:10:53 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:09.644 [2024-07-12 18:10:53.147651] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
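The event_perf timing above (real 0m1.364s, user 0m4.214s) admits a quick sanity check: with four busy-polling reactors, user CPU time should approach core count times wall time. A throwaway sketch of that arithmetic (the literals are copied from the run above):

```shell
# Average number of cores kept busy during the event_perf run:
# total user CPU time divided by wall-clock time.
real_s=1.364
user_s=4.214
awk -v u="$user_s" -v r="$real_s" \
  'BEGIN { printf "aggregate CPU: %.2f cores busy on average\n", u / r }'
```

Here that comes out to about 3.1 of the 4 reactors; the shortfall is likely the portion of the wall-clock time spent in setup and teardown rather than polling.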
00:07:09.644 [2024-07-12 18:10:53.147713] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2424650 ] 00:07:09.644 [2024-07-12 18:10:53.274843] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.644 [2024-07-12 18:10:53.372070] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.016 test_start 00:07:11.016 oneshot 00:07:11.016 tick 100 00:07:11.016 tick 100 00:07:11.016 tick 250 00:07:11.016 tick 100 00:07:11.016 tick 100 00:07:11.016 tick 250 00:07:11.016 tick 100 00:07:11.016 tick 500 00:07:11.016 tick 100 00:07:11.016 tick 100 00:07:11.016 tick 250 00:07:11.016 tick 100 00:07:11.016 tick 100 00:07:11.016 test_end 00:07:11.016 00:07:11.016 real 0m1.343s 00:07:11.016 user 0m1.202s 00:07:11.016 sys 0m0.135s 00:07:11.016 18:10:54 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:11.016 18:10:54 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:11.016 ************************************ 00:07:11.016 END TEST event_reactor 00:07:11.016 ************************************ 00:07:11.016 18:10:54 event -- common/autotest_common.sh@1142 -- # return 0 00:07:11.016 18:10:54 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:11.016 18:10:54 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:07:11.016 18:10:54 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.016 18:10:54 event -- common/autotest_common.sh@10 -- # set +x 00:07:11.016 ************************************ 00:07:11.016 START TEST event_reactor_perf 00:07:11.016 ************************************ 00:07:11.016 18:10:54 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:11.016 [2024-07-12 18:10:54.562983] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:11.016 [2024-07-12 18:10:54.563049] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2424885 ] 00:07:11.016 [2024-07-12 18:10:54.688293] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.274 [2024-07-12 18:10:54.789103] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.266 test_start 00:07:12.266 test_end 00:07:12.266 Performance: 328848 events per second 00:07:12.266 00:07:12.266 real 0m1.342s 00:07:12.266 user 0m1.199s 00:07:12.266 sys 0m0.137s 00:07:12.266 18:10:55 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:12.266 18:10:55 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:12.266 ************************************ 00:07:12.266 END TEST event_reactor_perf 00:07:12.266 ************************************ 00:07:12.266 18:10:55 event -- common/autotest_common.sh@1142 -- # return 0 00:07:12.266 18:10:55 event -- event/event.sh@49 -- # uname -s 00:07:12.266 18:10:55 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:12.266 18:10:55 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:12.266 18:10:55 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:12.266 18:10:55 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.266 18:10:55 event -- common/autotest_common.sh@10 -- # set +x 00:07:12.266 ************************************ 00:07:12.266 START TEST event_scheduler 00:07:12.266 ************************************ 
00:07:12.266 18:10:55 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:12.552 * Looking for test storage... 00:07:12.552 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:07:12.552 18:10:56 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:12.552 18:10:56 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=2425140 00:07:12.552 18:10:56 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:12.552 18:10:56 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:12.552 18:10:56 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 2425140 00:07:12.552 18:10:56 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 2425140 ']' 00:07:12.552 18:10:56 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:12.552 18:10:56 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:12.552 18:10:56 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:12.552 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:12.552 18:10:56 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:12.552 18:10:56 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:12.552 [2024-07-12 18:10:56.123541] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:12.552 [2024-07-12 18:10:56.123601] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2425140 ] 00:07:12.552 [2024-07-12 18:10:56.207717] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:12.811 [2024-07-12 18:10:56.296788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.811 [2024-07-12 18:10:56.296952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:12.811 [2024-07-12 18:10:56.296954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:12.811 [2024-07-12 18:10:56.296868] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:13.378 18:10:57 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:13.378 18:10:57 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:07:13.378 18:10:57 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:13.378 18:10:57 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.378 18:10:57 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:13.378 [2024-07-12 18:10:57.023652] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:13.378 [2024-07-12 18:10:57.023674] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:07:13.378 [2024-07-12 18:10:57.023686] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:13.378 [2024-07-12 18:10:57.023694] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:13.378 [2024-07-12 18:10:57.023702] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:13.378 18:10:57 
event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.378 18:10:57 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:13.378 18:10:57 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.378 18:10:57 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 [2024-07-12 18:10:57.118031] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:13.637 18:10:57 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:13.637 18:10:57 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:13.637 18:10:57 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 ************************************ 00:07:13.637 START TEST scheduler_create_thread 00:07:13.637 ************************************ 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 2 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
active_pinned -m 0x2 -a 100 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 3 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 4 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 5 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 6 00:07:13.637 18:10:57 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 7 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 8 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 9 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:13.637 18:10:57 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.637 10 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.637 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.638 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.638 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:13.638 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:13.638 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.638 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:13.638 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:13.638 18:10:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:13.638 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:13.638 18:10:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:15.013 18:10:58 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:15.013 18:10:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:15.013 18:10:58 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:15.014 18:10:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:15.014 18:10:58 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:16.388 18:10:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:16.388 00:07:16.388 real 0m2.622s 00:07:16.388 user 0m0.021s 00:07:16.388 sys 0m0.010s 00:07:16.388 18:10:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.388 18:10:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:16.388 ************************************ 00:07:16.388 END TEST scheduler_create_thread 00:07:16.388 ************************************ 00:07:16.388 18:10:59 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:07:16.389 18:10:59 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:16.389 18:10:59 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 2425140 00:07:16.389 18:10:59 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 2425140 ']' 00:07:16.389 18:10:59 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 2425140 00:07:16.389 18:10:59 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:07:16.389 18:10:59 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:16.389 18:10:59 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2425140 00:07:16.389 18:10:59 event.event_scheduler -- 
common/autotest_common.sh@954 -- # process_name=reactor_2 00:07:16.389 18:10:59 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:07:16.389 18:10:59 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2425140' 00:07:16.389 killing process with pid 2425140 00:07:16.389 18:10:59 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 2425140 00:07:16.389 18:10:59 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 2425140 00:07:16.647 [2024-07-12 18:11:00.264240] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:07:16.906 00:07:16.906 real 0m4.534s 00:07:16.906 user 0m8.572s 00:07:16.906 sys 0m0.478s 00:07:16.906 18:11:00 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.906 18:11:00 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:16.906 ************************************ 00:07:16.906 END TEST event_scheduler 00:07:16.906 ************************************ 00:07:16.906 18:11:00 event -- common/autotest_common.sh@1142 -- # return 0 00:07:16.906 18:11:00 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:16.906 18:11:00 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:16.906 18:11:00 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:16.906 18:11:00 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.906 18:11:00 event -- common/autotest_common.sh@10 -- # set +x 00:07:16.906 ************************************ 00:07:16.906 START TEST app_repeat 00:07:16.906 ************************************ 00:07:16.906 18:11:00 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:16.906 18:11:00 
event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@19 -- # repeat_pid=2425723 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2425723' 00:07:16.906 Process app_repeat pid: 2425723 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:16.906 spdk_app_start Round 0 00:07:16.906 18:11:00 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2425723 /var/tmp/spdk-nbd.sock 00:07:16.906 18:11:00 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2425723 ']' 00:07:16.906 18:11:00 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:16.906 18:11:00 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:16.906 18:11:00 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:16.906 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:16.906 18:11:00 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:16.906 18:11:00 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:16.906 [2024-07-12 18:11:00.627447] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:16.906 [2024-07-12 18:11:00.627515] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2425723 ] 00:07:17.165 [2024-07-12 18:11:00.758428] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:17.165 [2024-07-12 18:11:00.856172] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:17.165 [2024-07-12 18:11:00.856178] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.100 18:11:01 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:18.100 18:11:01 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:18.100 18:11:01 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:18.100 Malloc0 00:07:18.100 18:11:01 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:18.360 Malloc1 00:07:18.360 18:11:01 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@92 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:18.360 18:11:01 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:18.618 /dev/nbd0 00:07:18.619 18:11:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:18.619 18:11:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:18.619 18:11:02 event.app_repeat -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:18.619 1+0 records in 00:07:18.619 1+0 records out 00:07:18.619 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255947 s, 16.0 MB/s 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:18.619 18:11:02 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:18.619 18:11:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:18.619 18:11:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:18.619 18:11:02 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:18.877 /dev/nbd1 00:07:18.877 18:11:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:18.877 18:11:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:18.877 18:11:02 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:18.877 1+0 records in 00:07:18.877 1+0 records out 00:07:18.877 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274983 s, 14.9 MB/s 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:18.877 18:11:02 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:18.877 18:11:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:18.877 18:11:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:18.877 18:11:02 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:18.877 18:11:02 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:18.877 18:11:02 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:19.136 { 00:07:19.136 "nbd_device": "/dev/nbd0", 00:07:19.136 "bdev_name": "Malloc0" 00:07:19.136 }, 00:07:19.136 { 00:07:19.136 
"nbd_device": "/dev/nbd1", 00:07:19.136 "bdev_name": "Malloc1" 00:07:19.136 } 00:07:19.136 ]' 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:19.136 { 00:07:19.136 "nbd_device": "/dev/nbd0", 00:07:19.136 "bdev_name": "Malloc0" 00:07:19.136 }, 00:07:19.136 { 00:07:19.136 "nbd_device": "/dev/nbd1", 00:07:19.136 "bdev_name": "Malloc1" 00:07:19.136 } 00:07:19.136 ]' 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:19.136 /dev/nbd1' 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:19.136 /dev/nbd1' 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:19.136 256+0 records in 00:07:19.136 256+0 
records out 00:07:19.136 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00817942 s, 128 MB/s 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:19.136 256+0 records in 00:07:19.136 256+0 records out 00:07:19.136 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197026 s, 53.2 MB/s 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:19.136 256+0 records in 00:07:19.136 256+0 records out 00:07:19.136 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0318531 s, 32.9 MB/s 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@82 -- # 
for i in "${nbd_list[@]}" 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:19.136 18:11:02 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:19.137 18:11:02 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:19.137 18:11:02 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:19.137 18:11:02 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:19.137 18:11:02 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:19.137 18:11:02 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:19.137 18:11:02 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.137 18:11:02 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:19.394 18:11:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:19.395 18:11:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:19.395 18:11:03 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:19.395 18:11:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:19.395 18:11:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:19.653 18:11:03 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:19.653 18:11:03 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:19.653 18:11:03 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:19.653 18:11:03 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:19.653 18:11:03 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:19.910 18:11:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
count=0 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:20.168 18:11:03 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:20.168 18:11:03 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:20.426 18:11:03 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:20.684 [2024-07-12 18:11:04.235170] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:20.684 [2024-07-12 18:11:04.333115] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:20.684 [2024-07-12 18:11:04.333121] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.684 [2024-07-12 18:11:04.379538] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:20.684 [2024-07-12 18:11:04.379590] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:23.966 18:11:06 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:23.966 18:11:06 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:23.966 spdk_app_start Round 1 00:07:23.966 18:11:06 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2425723 /var/tmp/spdk-nbd.sock 00:07:23.966 18:11:06 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2425723 ']' 00:07:23.966 18:11:06 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:23.966 18:11:06 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:23.966 18:11:06 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:23.966 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:23.966 18:11:06 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:23.966 18:11:06 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:23.966 18:11:07 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:23.966 18:11:07 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:23.966 18:11:07 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.966 Malloc0 00:07:23.966 18:11:07 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:23.966 Malloc1 00:07:23.966 18:11:07 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:23.966 18:11:07 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:23.966 /dev/nbd0 00:07:24.223 18:11:07 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:24.223 18:11:07 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:24.223 1+0 records in 00:07:24.223 1+0 records out 00:07:24.223 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228408 s, 17.9 MB/s 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:24.223 18:11:07 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:24.223 18:11:07 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:24.223 18:11:07 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:24.223 18:11:07 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.223 18:11:07 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:24.481 /dev/nbd1 00:07:24.481 18:11:07 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:24.481 18:11:07 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:24.481 18:11:07 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:24.481 18:11:07 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:24.481 18:11:07 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:24.481 18:11:07 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:24.481 18:11:07 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:24.481 18:11:08 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:24.481 18:11:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:24.481 18:11:08 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:24.481 18:11:08 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:24.481 1+0 records in 00:07:24.481 1+0 records out 00:07:24.481 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267248 s, 15.3 MB/s 00:07:24.481 18:11:08 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.481 18:11:08 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:24.481 18:11:08 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:24.481 18:11:08 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:24.481 18:11:08 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:24.481 18:11:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:24.481 18:11:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:24.481 18:11:08 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:24.481 18:11:08 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.481 18:11:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:24.739 { 00:07:24.739 "nbd_device": "/dev/nbd0", 00:07:24.739 "bdev_name": "Malloc0" 00:07:24.739 }, 00:07:24.739 { 00:07:24.739 "nbd_device": "/dev/nbd1", 00:07:24.739 "bdev_name": "Malloc1" 00:07:24.739 } 00:07:24.739 ]' 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:24.739 { 00:07:24.739 "nbd_device": "/dev/nbd0", 00:07:24.739 "bdev_name": "Malloc0" 00:07:24.739 }, 00:07:24.739 { 00:07:24.739 "nbd_device": "/dev/nbd1", 00:07:24.739 "bdev_name": "Malloc1" 00:07:24.739 } 00:07:24.739 ]' 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:24.739 /dev/nbd1' 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:24.739 /dev/nbd1' 00:07:24.739 
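The `nbd_get_count` step traced above can be sketched on its own: the `nbd_get_disks` RPC returns a JSON array, `jq` extracts the device paths, and `grep -c` counts them. This is a hedged sketch, not the test itself: the JSON below is copied from the log rather than fetched over the RPC socket, and `jq` is assumed to be installed.

```shell
# Stand-alone sketch of the nbd_get_count logic from bdev/nbd_common.sh.
# The JSON is pasted from the log output above, not fetched via rpc.py.
nbd_disks_json='[
  { "nbd_device": "/dev/nbd0", "bdev_name": "Malloc0" },
  { "nbd_device": "/dev/nbd1", "bdev_name": "Malloc1" }
]'

# jq pulls out just the device paths, one per line
nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')

# grep -c counts the lines that name an nbd device
count=$(echo "$nbd_disks_name" | grep -c /dev/nbd)
echo "count=$count"
```

With two attached devices this yields `count=2`, which is what the trace's `'[' 2 -ne 2 ']'` check then compares against the expected number of bdevs.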
18:11:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:24.739 256+0 records in 00:07:24.739 256+0 records out 00:07:24.739 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0111911 s, 93.7 MB/s 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:24.739 256+0 records in 00:07:24.739 256+0 records out 00:07:24.739 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0184956 s, 56.7 MB/s 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:24.739 256+0 records in 00:07:24.739 256+0 records out 00:07:24.739 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0314011 s, 33.4 MB/s 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
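The `nbd_dd_data_verify` write/verify pattern traced above reduces to: fill a 1 MiB random pattern file, `dd` it onto each NBD device, then `cmp` the first 1M of each device back against the pattern. The sketch below is hedged: plain temp files stand in for `/dev/nbd0` and `/dev/nbd1` so it runs anywhere, whereas the real test writes with `oflag=direct` to actual NBD block devices.

```shell
# Sketch of nbd_dd_data_verify, with temp files standing in for /dev/nbdX.
tmp_file=$(mktemp)
dev0=$(mktemp)    # stand-in for /dev/nbd0
dev1=$(mktemp)    # stand-in for /dev/nbd1

# write phase: generate a 1 MiB random pattern, copy it to each "device"
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 2>/dev/null
for dev in "$dev0" "$dev1"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 2>/dev/null
done

# verify phase: byte-compare the first 1M of each "device" to the pattern
verify_status=0
for dev in "$dev0" "$dev1"; do
    cmp -b -n 1M "$tmp_file" "$dev" || verify_status=1
done
rm -f "$tmp_file" "$dev0" "$dev1"
echo "verify_status=$verify_status"
```

A nonzero `verify_status` here corresponds to `cmp` finding a differing byte, which in the real run would fail the test before the pattern file is removed.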
00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.739 18:11:08 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:24.997 18:11:08 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:25.253 18:11:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:25.509 18:11:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:25.509 18:11:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:25.509 18:11:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:25.766 18:11:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:25.766 18:11:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:25.766 18:11:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:25.766 18:11:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:25.766 18:11:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:25.766 18:11:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:25.766 18:11:09 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:25.766 18:11:09 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:25.766 18:11:09 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:25.766 18:11:09 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:26.024 18:11:09 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:26.282 [2024-07-12 18:11:09.771458] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:26.282 [2024-07-12 18:11:09.870654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:26.282 [2024-07-12 18:11:09.870658] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:26.282 [2024-07-12 18:11:09.924116] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:26.282 [2024-07-12 18:11:09.924171] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:28.809 18:11:12 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:28.809 18:11:12 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:28.809 spdk_app_start Round 2 00:07:28.809 18:11:12 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2425723 /var/tmp/spdk-nbd.sock 00:07:28.809 18:11:12 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2425723 ']' 00:07:28.809 18:11:12 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:28.809 18:11:12 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:28.809 18:11:12 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:28.809 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
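The `waitfornbd` helper that appears throughout the trace is a bounded polling loop: retry up to 20 times for the device name to appear (word-matched) in `/proc/partitions`, then `break`. The sketch below is a simplification under stated assumptions: the second `parts` argument is added here so it can run against a stand-in file instead of the real `/proc/partitions`, and the follow-up `dd`/`stat` read check the real helper performs is omitted.

```shell
# Simplified sketch of waitfornbd from common/autotest_common.sh.
# The optional $2 (partitions file) is an assumption added for testability.
waitfornbd() {
    local nbd_name=$1 parts=${2:-/proc/partitions} i
    for ((i = 1; i <= 20; i++)); do
        # -w avoids matching nbd1 when waiting for nbd10, etc.
        grep -q -w "$nbd_name" "$parts" && return 0
        sleep 0.1
    done
    return 1    # device never showed up in the partitions listing
}

# demonstrate against a stand-in partitions file (NBD major is 43)
parts=$(mktemp)
printf '  43        0    1048576 nbd0\n' > "$parts"
waitfornbd nbd0 "$parts" && echo "nbd0 present"
rm -f "$parts"
```

The `waitfornbd_exit` variant seen at teardown is the same loop with the condition inverted: it breaks once the name is gone from the listing.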
00:07:28.809 18:11:12 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:28.809 18:11:12 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:29.067 18:11:12 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:29.067 18:11:12 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:29.067 18:11:12 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:29.329 Malloc0 00:07:29.329 18:11:12 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:29.587 Malloc1 00:07:29.587 18:11:13 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.587 18:11:13 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:29.844 /dev/nbd0 00:07:29.844 18:11:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:29.844 18:11:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:29.844 1+0 records in 00:07:29.844 1+0 records out 00:07:29.844 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228683 s, 17.9 MB/s 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:29.844 18:11:13 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:29.844 18:11:13 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:29.844 18:11:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:29.844 18:11:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:29.844 18:11:13 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:30.102 /dev/nbd1 00:07:30.102 18:11:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:30.102 18:11:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:30.102 1+0 records in 00:07:30.102 1+0 records out 00:07:30.102 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259654 s, 15.8 MB/s 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:30.102 18:11:13 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:30.102 18:11:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:30.102 18:11:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:30.102 18:11:13 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:30.102 18:11:13 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.102 18:11:13 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:30.361 18:11:13 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:30.361 { 00:07:30.361 "nbd_device": "/dev/nbd0", 00:07:30.361 "bdev_name": "Malloc0" 00:07:30.361 }, 00:07:30.361 { 00:07:30.361 "nbd_device": "/dev/nbd1", 00:07:30.361 "bdev_name": "Malloc1" 00:07:30.361 } 00:07:30.361 ]' 00:07:30.361 18:11:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:30.361 { 00:07:30.361 "nbd_device": "/dev/nbd0", 00:07:30.361 "bdev_name": "Malloc0" 00:07:30.361 }, 00:07:30.361 { 00:07:30.361 "nbd_device": "/dev/nbd1", 00:07:30.361 "bdev_name": "Malloc1" 00:07:30.361 } 00:07:30.361 ]' 00:07:30.361 18:11:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:30.361 /dev/nbd1' 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:30.361 /dev/nbd1' 00:07:30.361 
18:11:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:30.361 256+0 records in 00:07:30.361 256+0 records out 00:07:30.361 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114813 s, 91.3 MB/s 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:30.361 256+0 records in 00:07:30.361 256+0 records out 00:07:30.361 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0299567 s, 35.0 MB/s 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:30.361 18:11:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:30.619 256+0 records in 00:07:30.619 256+0 records out 00:07:30.620 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0290859 s, 36.1 MB/s 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.620 18:11:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:30.881 18:11:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:31.139 18:11:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:31.397 18:11:14 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:31.397 18:11:14 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:31.658 18:11:15 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:31.975 [2024-07-12 18:11:15.483407] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:31.975 [2024-07-12 18:11:15.579549] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:31.975 [2024-07-12 18:11:15.579554] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:31.975 [2024-07-12 18:11:15.625266] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:31.975 [2024-07-12 18:11:15.625322] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:35.258 18:11:18 event.app_repeat -- event/event.sh@38 -- # waitforlisten 2425723 /var/tmp/spdk-nbd.sock 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2425723 ']' 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:35.258 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:35.258 18:11:18 event.app_repeat -- event/event.sh@39 -- # killprocess 2425723 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 2425723 ']' 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 2425723 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2425723 00:07:35.258 18:11:18 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:35.259 18:11:18 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:35.259 18:11:18 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2425723' 00:07:35.259 killing process with pid 2425723 00:07:35.259 18:11:18 event.app_repeat -- common/autotest_common.sh@967 -- # kill 2425723 00:07:35.259 18:11:18 event.app_repeat -- common/autotest_common.sh@972 -- # wait 2425723 00:07:35.259 spdk_app_start is called in Round 0. 00:07:35.259 Shutdown signal received, stop current app iteration 00:07:35.259 Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 reinitialization... 00:07:35.259 spdk_app_start is called in Round 1. 00:07:35.259 Shutdown signal received, stop current app iteration 00:07:35.259 Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 reinitialization... 00:07:35.259 spdk_app_start is called in Round 2. 
00:07:35.259 Shutdown signal received, stop current app iteration 00:07:35.259 Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 reinitialization... 00:07:35.259 spdk_app_start is called in Round 3. 00:07:35.259 Shutdown signal received, stop current app iteration 00:07:35.259 18:11:18 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:35.259 18:11:18 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:35.259 00:07:35.259 real 0m18.163s 00:07:35.259 user 0m38.884s 00:07:35.259 sys 0m3.845s 00:07:35.259 18:11:18 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.259 18:11:18 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:35.259 ************************************ 00:07:35.259 END TEST app_repeat 00:07:35.259 ************************************ 00:07:35.259 18:11:18 event -- common/autotest_common.sh@1142 -- # return 0 00:07:35.259 18:11:18 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:35.259 00:07:35.259 real 0m27.264s 00:07:35.259 user 0m54.245s 00:07:35.259 sys 0m5.120s 00:07:35.259 18:11:18 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.259 18:11:18 event -- common/autotest_common.sh@10 -- # set +x 00:07:35.259 ************************************ 00:07:35.259 END TEST event 00:07:35.259 ************************************ 00:07:35.259 18:11:18 -- common/autotest_common.sh@1142 -- # return 0 00:07:35.259 18:11:18 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:35.259 18:11:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:35.259 18:11:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.259 18:11:18 -- common/autotest_common.sh@10 -- # set +x 00:07:35.259 ************************************ 00:07:35.259 START TEST thread 00:07:35.259 ************************************ 00:07:35.259 18:11:18 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:35.259 * Looking for test storage... 00:07:35.259 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:35.259 18:11:18 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:35.259 18:11:18 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:35.259 18:11:18 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.259 18:11:18 thread -- common/autotest_common.sh@10 -- # set +x 00:07:35.518 ************************************ 00:07:35.518 START TEST thread_poller_perf 00:07:35.518 ************************************ 00:07:35.518 18:11:19 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:35.518 [2024-07-12 18:11:19.042465] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:35.518 [2024-07-12 18:11:19.042532] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2428412 ] 00:07:35.518 [2024-07-12 18:11:19.161602] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.776 [2024-07-12 18:11:19.259034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.776 Running 1000 pollers for 1 seconds with 1 microseconds period. 
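The summary that follows reports busy cycles, total_run_count and tsc_hz, and derives poller_cost from them. A minimal sketch of that arithmetic (not part of the test suite; integer division and the exact rounding are assumptions inferred from the numbers in the log):

```shell
# Hypothetical sketch: how poller_perf's summary derives poller_cost from the
# counters it reports. busy cycles / run count gives cost in cycles; cycles are
# converted to nanoseconds via the TSC frequency. Plain integer division assumed.
busy_cyc=2311733776      # "busy:" line from the summary below
run_count=265000         # "total_run_count:"
tsc_hz=2300000000        # "tsc_hz:" (cycles per second)

poller_cost_cyc=$(( busy_cyc / run_count ))
poller_cost_ns=$(( poller_cost_cyc * 1000000000 / tsc_hz ))

echo "poller_cost: ${poller_cost_cyc} (cyc), ${poller_cost_ns} (nsec)"
```

With the values above this reproduces the reported `poller_cost: 8723 (cyc), 3792 (nsec)`.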
00:07:36.710 ======================================
00:07:36.710 busy:2311733776 (cyc)
00:07:36.710 total_run_count: 265000
00:07:36.710 tsc_hz: 2300000000 (cyc)
00:07:36.710 ======================================
00:07:36.710 poller_cost: 8723 (cyc), 3792 (nsec)
00:07:36.710
00:07:36.710 real 0m1.348s
00:07:36.710 user 0m1.212s
00:07:36.710 sys 0m0.129s
00:07:36.710 18:11:20 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:36.710 18:11:20 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:07:36.710 ************************************
00:07:36.710 END TEST thread_poller_perf
00:07:36.710 ************************************
00:07:36.710 18:11:20 thread -- common/autotest_common.sh@1142 -- # return 0
00:07:36.710 18:11:20 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:07:36.710 18:11:20 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']'
00:07:36.710 18:11:20 thread -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:36.710 18:11:20 thread -- common/autotest_common.sh@10 -- # set +x
00:07:36.969 ************************************
00:07:36.969 START TEST thread_poller_perf ************************************
00:07:36.969 18:11:20 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:07:36.969 [2024-07-12 18:11:20.456339] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:07:36.969 [2024-07-12 18:11:20.456400] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2428617 ]
00:07:36.969 [2024-07-12 18:11:20.583509] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:36.969 [2024-07-12 18:11:20.683446] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:36.969 Running 1000 pollers for 1 seconds with 0 microseconds period.
00:07:38.344 ======================================
00:07:38.344 busy:2302819032 (cyc)
00:07:38.344 total_run_count: 3511000
00:07:38.344 tsc_hz: 2300000000 (cyc)
00:07:38.344 ======================================
00:07:38.344 poller_cost: 655 (cyc), 284 (nsec)
00:07:38.344
00:07:38.344 real 0m1.335s
00:07:38.344 user 0m1.202s
00:07:38.344 sys 0m0.127s
00:07:38.344 18:11:21 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:38.344 18:11:21 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:07:38.344 ************************************
00:07:38.344 END TEST thread_poller_perf
00:07:38.344 ************************************
00:07:38.344 18:11:21 thread -- common/autotest_common.sh@1142 -- # return 0
00:07:38.344 18:11:21 thread -- thread/thread.sh@17 -- # [[ y != \y ]]
00:07:38.344
00:07:38.344 real 0m2.934s
00:07:38.344 user 0m2.520s
00:07:38.344 sys 0m0.423s
00:07:38.344 18:11:21 thread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:38.344 18:11:21 thread -- common/autotest_common.sh@10 -- # set +x
00:07:38.344 ************************************
00:07:38.344 END TEST thread
00:07:38.344 ************************************
00:07:38.344 18:11:21 -- common/autotest_common.sh@1142 -- # return 0
00:07:38.344 18:11:21 -- spdk/autotest.sh@183 -- # run_test accel
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:38.344 18:11:21 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:38.344 18:11:21 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.344 18:11:21 -- common/autotest_common.sh@10 -- # set +x 00:07:38.344 ************************************ 00:07:38.344 START TEST accel 00:07:38.344 ************************************ 00:07:38.344 18:11:21 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:07:38.344 * Looking for test storage... 00:07:38.344 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:38.344 18:11:22 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:38.344 18:11:22 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:07:38.344 18:11:22 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:38.344 18:11:22 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2428851 00:07:38.344 18:11:22 accel -- accel/accel.sh@63 -- # waitforlisten 2428851 00:07:38.344 18:11:22 accel -- common/autotest_common.sh@829 -- # '[' -z 2428851 ']' 00:07:38.344 18:11:22 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:38.344 18:11:22 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:38.344 18:11:22 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:38.344 18:11:22 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:38.344 18:11:22 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:38.344 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:38.344 18:11:22 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:38.344 18:11:22 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:38.344 18:11:22 accel -- common/autotest_common.sh@10 -- # set +x 00:07:38.344 18:11:22 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:38.344 18:11:22 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:38.344 18:11:22 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:38.344 18:11:22 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:38.344 18:11:22 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:38.344 18:11:22 accel -- accel/accel.sh@41 -- # jq -r . 00:07:38.603 [2024-07-12 18:11:22.073433] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:38.603 [2024-07-12 18:11:22.073512] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2428851 ] 00:07:38.603 [2024-07-12 18:11:22.204181] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.603 [2024-07-12 18:11:22.301692] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.537 18:11:22 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:39.537 18:11:22 accel -- common/autotest_common.sh@862 -- # return 0 00:07:39.537 18:11:22 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:39.537 18:11:22 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:39.537 18:11:22 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:39.537 18:11:22 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:39.537 18:11:22 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:39.537 18:11:22 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:39.537 18:11:22 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:39.537 18:11:22 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:07:39.537 18:11:22 accel -- common/autotest_common.sh@10 -- # set +x 00:07:39.537 18:11:22 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:39.537 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.537 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.537 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.537 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # 
IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # IFS== 00:07:39.538 18:11:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:39.538 18:11:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:39.538 18:11:22 accel -- accel/accel.sh@75 -- # killprocess 2428851 00:07:39.538 18:11:22 accel -- common/autotest_common.sh@948 -- # '[' -z 2428851 ']' 00:07:39.538 18:11:22 accel -- common/autotest_common.sh@952 -- # kill -0 2428851 00:07:39.538 18:11:22 accel -- common/autotest_common.sh@953 -- # uname 00:07:39.538 18:11:22 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:39.538 18:11:22 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2428851 00:07:39.538 18:11:23 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:39.538 18:11:23 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:39.538 18:11:23 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2428851' 00:07:39.538 killing process with pid 2428851 00:07:39.538 18:11:23 accel -- common/autotest_common.sh@967 -- # kill 2428851 00:07:39.538 
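The repeated `IFS==` / `read -r opc module` trace lines above come from accel.sh's loop that records each "opcode=module" assignment in the `expected_opcs` associative array. A minimal standalone sketch of that loop (the sample pairs are invented for illustration; the real list comes from the `accel_get_opc_assignments` RPC filtered through jq):

```shell
# Hypothetical sketch of the exp_opcs loop traced above: split each
# "opcode=module" pair on '=' (via IFS) and store it in an associative array.
# The three sample pairs below are invented; the real ones come from
# 'rpc_cmd accel_get_opc_assignments | jq -r ...'.
declare -A expected_opcs
for opc_opt in "copy=software" "fill=software" "crc32c=software"; do
    # Setting IFS to '=' just for this read splits opc_opt at the '='.
    IFS== read -r opc module <<< "$opc_opt"
    expected_opcs["$opc"]=$module
done
echo "crc32c handled by: ${expected_opcs[crc32c]}"
```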
18:11:23 accel -- common/autotest_common.sh@972 -- # wait 2428851 00:07:39.795 18:11:23 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:39.795 18:11:23 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:39.795 18:11:23 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:07:39.795 18:11:23 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:39.796 18:11:23 accel -- common/autotest_common.sh@10 -- # set +x 00:07:39.796 18:11:23 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:07:39.796 18:11:23 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:39.796 18:11:23 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:07:39.796 18:11:23 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:39.796 18:11:23 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:39.796 18:11:23 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.796 18:11:23 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.796 18:11:23 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:39.796 18:11:23 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:07:39.796 18:11:23 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:07:39.796 18:11:23 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:39.796 18:11:23 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:07:40.054 18:11:23 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:40.054 18:11:23 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:40.054 18:11:23 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:40.054 18:11:23 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.054 18:11:23 accel -- common/autotest_common.sh@10 -- # set +x 00:07:40.054 ************************************ 00:07:40.054 START TEST accel_missing_filename 00:07:40.054 ************************************ 00:07:40.054 18:11:23 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:07:40.054 18:11:23 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:07:40.054 18:11:23 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:40.054 18:11:23 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:40.054 18:11:23 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:40.054 18:11:23 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:40.054 18:11:23 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:40.054 18:11:23 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:40.054 18:11:23 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:40.054 18:11:23 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:07:40.054 18:11:23 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:40.054 18:11:23 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:40.054 18:11:23 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:40.054 18:11:23 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:40.054 18:11:23 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:40.054 18:11:23 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:07:40.054 18:11:23 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:07:40.054 [2024-07-12 18:11:23.622714] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:40.054 [2024-07-12 18:11:23.622778] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2429070 ] 00:07:40.054 [2024-07-12 18:11:23.750019] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.312 [2024-07-12 18:11:23.851525] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.312 [2024-07-12 18:11:23.913607] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:40.312 [2024-07-12 18:11:23.985717] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:40.569 A filename is required. 
00:07:40.569 18:11:24 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:07:40.569 18:11:24 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:40.569 18:11:24 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:07:40.569 18:11:24 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:07:40.570 18:11:24 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:07:40.570 18:11:24 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:40.570 00:07:40.570 real 0m0.495s 00:07:40.570 user 0m0.327s 00:07:40.570 sys 0m0.199s 00:07:40.570 18:11:24 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:40.570 18:11:24 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:07:40.570 ************************************ 00:07:40.570 END TEST accel_missing_filename 00:07:40.570 ************************************ 00:07:40.570 18:11:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:40.570 18:11:24 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:40.570 18:11:24 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:40.570 18:11:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:40.570 18:11:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:40.570 ************************************ 00:07:40.570 START TEST accel_compress_verify 00:07:40.570 ************************************ 00:07:40.570 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:40.570 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:07:40.570 18:11:24 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:40.570 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:40.570 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:40.570 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:40.570 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:40.570 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:40.570 18:11:24 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:40.570 18:11:24 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:40.570 18:11:24 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:40.570 18:11:24 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:40.570 18:11:24 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:40.570 18:11:24 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:40.570 18:11:24 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:40.570 18:11:24 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:40.570 18:11:24 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:07:40.570 [2024-07-12 18:11:24.183023] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:40.570 [2024-07-12 18:11:24.183088] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2429220 ] 00:07:40.828 [2024-07-12 18:11:24.313458] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.828 [2024-07-12 18:11:24.413077] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.828 [2024-07-12 18:11:24.482020] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:40.828 [2024-07-12 18:11:24.556241] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:07:41.086 00:07:41.086 Compression does not support the verify option, aborting. 00:07:41.086 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:07:41.086 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:41.086 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:07:41.086 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:07:41.086 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:07:41.086 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:41.086 00:07:41.086 real 0m0.505s 00:07:41.086 user 0m0.333s 00:07:41.086 sys 0m0.191s 00:07:41.086 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.086 18:11:24 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:07:41.086 ************************************ 00:07:41.086 END TEST accel_compress_verify 00:07:41.086 ************************************ 00:07:41.086 18:11:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:41.086 18:11:24 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:07:41.086 18:11:24 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:41.086 18:11:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.086 18:11:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:41.086 ************************************ 00:07:41.086 START TEST accel_wrong_workload 00:07:41.086 ************************************ 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:41.086 18:11:24 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:41.086 18:11:24 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:07:41.086 18:11:24 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:41.086 18:11:24 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:41.086 18:11:24 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.086 18:11:24 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.086 18:11:24 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:07:41.086 18:11:24 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:07:41.086 18:11:24 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:07:41.086 Unsupported workload type: foobar 00:07:41.086 [2024-07-12 18:11:24.772630] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:41.086 accel_perf options: 00:07:41.086 [-h help message] 00:07:41.086 [-q queue depth per core] 00:07:41.086 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:41.086 [-T number of threads per core 00:07:41.086 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:41.086 [-t time in seconds] 00:07:41.086 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:41.086 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:41.086 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:41.086 [-l for compress/decompress workloads, name of uncompressed input file 00:07:41.086 [-S for crc32c workload, use this seed value (default 0) 00:07:41.086 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:41.086 [-f for fill workload, use this BYTE value (default 255) 00:07:41.086 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:41.086 [-y verify result if this switch is on] 00:07:41.086 [-a tasks to allocate per core (default: same value as -q)] 00:07:41.086 Can be used to spread operations across a wider range of memory. 
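The "Unsupported workload type: foobar" failure above is deliberate: the test wraps accel_perf in autotest's NOT helper, which passes only when the wrapped command fails. A minimal sketch of that negative-test pattern (the `not` and `bad_workload` names here are stand-ins, not the suite's actual helpers):

```shell
# Hypothetical sketch of the expected-failure pattern used in these tests:
# invert a command's exit status so a rejected invocation counts as a pass.
not() { ! "$@"; }

# Stand-in for 'accel_perf -t 1 -w foobar', which the real binary rejects.
bad_workload() {
    echo "Unsupported workload type: foobar" >&2
    return 1
}

if not bad_workload; then
    result="negative test passed"
fi
echo "$result"
```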
00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:41.086 00:07:41.086 real 0m0.043s 00:07:41.086 user 0m0.027s 00:07:41.086 sys 0m0.016s 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.086 18:11:24 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:07:41.086 ************************************ 00:07:41.086 END TEST accel_wrong_workload 00:07:41.086 ************************************ 00:07:41.086 Error: writing output failed: Broken pipe 00:07:41.345 18:11:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:41.345 18:11:24 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:41.345 18:11:24 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:07:41.345 18:11:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.345 18:11:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:41.345 ************************************ 00:07:41.345 START TEST accel_negative_buffers 00:07:41.345 ************************************ 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:41.345 18:11:24 accel.accel_negative_buffers -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:41.345 18:11:24 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:07:41.345 18:11:24 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:07:41.345 18:11:24 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:41.345 18:11:24 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:41.345 18:11:24 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.345 18:11:24 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.345 18:11:24 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:41.345 18:11:24 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:07:41.345 18:11:24 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:07:41.345 -x option must be non-negative. 00:07:41.345 [2024-07-12 18:11:24.888537] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:41.345 accel_perf options: 00:07:41.345 [-h help message] 00:07:41.345 [-q queue depth per core] 00:07:41.345 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:41.345 [-T number of threads per core 00:07:41.345 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:07:41.345 [-t time in seconds] 00:07:41.345 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:41.345 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:07:41.345 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:41.345 [-l for compress/decompress workloads, name of uncompressed input file 00:07:41.345 [-S for crc32c workload, use this seed value (default 0) 00:07:41.345 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:41.345 [-f for fill workload, use this BYTE value (default 255) 00:07:41.345 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:41.345 [-y verify result if this switch is on] 00:07:41.345 [-a tasks to allocate per core (default: same value as -q)] 00:07:41.345 Can be used to spread operations across a wider range of memory. 
00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:41.345 00:07:41.345 real 0m0.043s 00:07:41.345 user 0m0.024s 00:07:41.345 sys 0m0.019s 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:41.345 18:11:24 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:07:41.345 ************************************ 00:07:41.345 END TEST accel_negative_buffers 00:07:41.345 ************************************ 00:07:41.345 Error: writing output failed: Broken pipe 00:07:41.345 18:11:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:41.345 18:11:24 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:41.345 18:11:24 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:41.345 18:11:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:41.345 18:11:24 accel -- common/autotest_common.sh@10 -- # set +x 00:07:41.345 ************************************ 00:07:41.345 START TEST accel_crc32c 00:07:41.345 ************************************ 00:07:41.345 18:11:24 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 
00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:41.345 18:11:24 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:41.345 [2024-07-12 18:11:25.012556] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:41.345 [2024-07-12 18:11:25.012620] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2429321 ] 00:07:41.604 [2024-07-12 18:11:25.142181] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.604 [2024-07-12 18:11:25.247282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 
00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 
accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:41.604 18:11:25 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.978 18:11:26 
accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:42.978 18:11:26 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:42.978 00:07:42.978 real 0m1.505s 00:07:42.978 user 0m1.316s 00:07:42.978 sys 0m0.190s 00:07:42.978 18:11:26 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:42.978 18:11:26 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:42.978 ************************************ 00:07:42.978 END TEST accel_crc32c 00:07:42.978 ************************************ 00:07:42.978 18:11:26 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:42.978 18:11:26 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:42.978 18:11:26 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:42.978 18:11:26 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:42.978 18:11:26 accel -- common/autotest_common.sh@10 -- # set +x 00:07:42.978 ************************************ 
00:07:42.978 START TEST accel_crc32c_C2 00:07:42.978 ************************************ 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:42.978 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:42.978 [2024-07-12 18:11:26.602591] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:42.978 [2024-07-12 18:11:26.602660] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2429529 ] 00:07:43.237 [2024-07-12 18:11:26.732109] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.237 [2024-07-12 18:11:26.833158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:43.237 
18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:43.237 18:11:26 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.613 00:07:44.613 real 0m1.510s 00:07:44.613 user 0m1.318s 00:07:44.613 sys 0m0.192s 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:44.613 18:11:28 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:44.613 ************************************ 00:07:44.613 END TEST accel_crc32c_C2 00:07:44.613 ************************************ 00:07:44.613 18:11:28 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:44.613 18:11:28 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:44.613 18:11:28 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:44.613 18:11:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:44.613 18:11:28 accel -- common/autotest_common.sh@10 -- # set +x 00:07:44.613 ************************************ 00:07:44.613 START TEST accel_copy 00:07:44.613 ************************************ 00:07:44.613 18:11:28 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:07:44.613 18:11:28 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:07:44.613 [2024-07-12 18:11:28.196190] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:44.613 [2024-07-12 18:11:28.196252] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2429819 ] 00:07:44.613 [2024-07-12 18:11:28.323892] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.871 [2024-07-12 18:11:28.426137] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 
accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:07:44.871 18:11:28 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.871 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.872 18:11:28 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:44.872 18:11:28 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.246 18:11:29 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:46.246 18:11:29 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:46.246 00:07:46.246 real 0m1.511s 00:07:46.246 user 0m1.317s 00:07:46.246 sys 0m0.195s 00:07:46.246 18:11:29 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.246 18:11:29 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:07:46.246 ************************************ 00:07:46.246 END TEST accel_copy 00:07:46.246 ************************************ 00:07:46.246 18:11:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:46.246 18:11:29 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:46.246 18:11:29 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:07:46.246 18:11:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.246 18:11:29 accel -- common/autotest_common.sh@10 -- # set +x 00:07:46.246 ************************************ 00:07:46.246 START TEST accel_fill 00:07:46.246 ************************************ 00:07:46.246 18:11:29 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:07:46.246 18:11:29 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:07:46.246 [2024-07-12 18:11:29.784407] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:46.246 [2024-07-12 18:11:29.784468] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2430078 ] 00:07:46.246 [2024-07-12 18:11:29.913151] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.503 [2024-07-12 18:11:30.018273] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.503 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.504 18:11:30 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:46.504 18:11:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:47.872 18:11:31 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:47.872 18:11:31 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:47.872 00:07:47.872 real 0m1.506s 00:07:47.872 user 0m1.313s 00:07:47.872 sys 0m0.197s 00:07:47.872 18:11:31 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.872 18:11:31 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:07:47.872 ************************************ 00:07:47.872 END TEST accel_fill 00:07:47.872 ************************************ 00:07:47.872 18:11:31 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:47.872 18:11:31 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:47.872 18:11:31 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:47.872 18:11:31 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.872 18:11:31 accel -- common/autotest_common.sh@10 -- # set +x 00:07:47.872 ************************************ 00:07:47.872 START TEST accel_copy_crc32c 00:07:47.872 ************************************ 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:07:47.872 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:07:47.872 [2024-07-12 18:11:31.377560] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:47.872 [2024-07-12 18:11:31.377620] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2430279 ] 00:07:47.872 [2024-07-12 18:11:31.507295] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.130 [2024-07-12 18:11:31.608758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.130 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.131 18:11:31 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:48.131 18:11:31 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.503 18:11:32 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:49.503 00:07:49.503 real 0m1.496s 00:07:49.503 user 0m1.302s 00:07:49.503 sys 0m0.199s 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:49.503 18:11:32 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:07:49.503 ************************************ 00:07:49.503 END TEST accel_copy_crc32c 00:07:49.503 ************************************ 00:07:49.503 18:11:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:49.503 18:11:32 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:49.503 18:11:32 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:49.503 18:11:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:49.503 18:11:32 accel -- common/autotest_common.sh@10 -- # set +x 00:07:49.503 ************************************ 00:07:49.503 START TEST accel_copy_crc32c_C2 00:07:49.503 
************************************ 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:49.503 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:49.504 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:07:49.504 18:11:32 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:07:49.504 [2024-07-12 18:11:32.953949] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:49.504 [2024-07-12 18:11:32.954015] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2430490 ] 00:07:49.504 [2024-07-12 18:11:33.082106] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.504 [2024-07-12 18:11:33.182983] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.763 18:11:33 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:49.763 18:11:33 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:50.696 
18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:50.696 00:07:50.696 real 0m1.501s 00:07:50.696 user 0m1.319s 00:07:50.696 sys 0m0.187s 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:50.696 18:11:34 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:07:50.696 ************************************ 00:07:50.696 END TEST accel_copy_crc32c_C2 00:07:50.696 ************************************ 00:07:50.955 18:11:34 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:50.955 18:11:34 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:50.955 18:11:34 accel -- common/autotest_common.sh@1099 -- # 
'[' 7 -le 1 ']' 00:07:50.955 18:11:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:50.955 18:11:34 accel -- common/autotest_common.sh@10 -- # set +x 00:07:50.955 ************************************ 00:07:50.955 START TEST accel_dualcast 00:07:50.955 ************************************ 00:07:50.955 18:11:34 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:07:50.955 18:11:34 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:07:50.955 [2024-07-12 18:11:34.537702] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:50.955 [2024-07-12 18:11:34.537761] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2430687 ] 00:07:50.955 [2024-07-12 18:11:34.666401] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.213 [2024-07-12 18:11:34.768639] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:07:51.213 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:51.214 18:11:34 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:52.647 18:11:36 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:52.647 00:07:52.647 real 0m1.514s 00:07:52.647 user 0m1.314s 00:07:52.647 sys 0m0.198s 00:07:52.647 18:11:36 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:52.647 18:11:36 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:07:52.647 ************************************ 00:07:52.647 END TEST accel_dualcast 00:07:52.647 ************************************ 00:07:52.647 18:11:36 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:52.647 18:11:36 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:52.647 18:11:36 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:52.647 18:11:36 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:52.647 18:11:36 accel -- common/autotest_common.sh@10 -- # set +x 00:07:52.647 ************************************ 00:07:52.647 START TEST accel_compare 00:07:52.647 ************************************ 00:07:52.647 18:11:36 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:52.647 
18:11:36 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:07:52.647 18:11:36 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:07:52.647 [2024-07-12 18:11:36.122606] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:52.647 [2024-07-12 18:11:36.122667] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2430894 ] 00:07:52.647 [2024-07-12 18:11:36.251515] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.647 [2024-07-12 18:11:36.350003] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.905 18:11:36 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:52.905 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 
18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:52.906 18:11:36 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:52.906 18:11:36 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@21 
-- # case "$var" in 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:54.277 18:11:37 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:54.277 00:07:54.277 real 0m1.502s 00:07:54.277 user 0m1.312s 00:07:54.277 sys 0m0.188s 00:07:54.277 18:11:37 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:54.277 18:11:37 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:07:54.277 ************************************ 00:07:54.277 END TEST accel_compare 00:07:54.277 ************************************ 00:07:54.277 18:11:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:54.277 18:11:37 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:54.277 18:11:37 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:07:54.277 18:11:37 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.277 18:11:37 accel -- common/autotest_common.sh@10 -- # set +x 00:07:54.277 ************************************ 00:07:54.277 START TEST accel_xor 00:07:54.277 ************************************ 00:07:54.277 18:11:37 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:54.277 18:11:37 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:54.278 18:11:37 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:54.278 [2024-07-12 18:11:37.703645] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:54.278 [2024-07-12 18:11:37.703704] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2431243 ] 00:07:54.278 [2024-07-12 18:11:37.833796] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.278 [2024-07-12 18:11:37.935395] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.278 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:54.278 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.278 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:54.535 18:11:38 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:54.536 18:11:38 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:54.536 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:54.536 18:11:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:55.468 18:11:39 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:55.468 00:07:55.468 real 0m1.513s 00:07:55.468 user 0m1.315s 00:07:55.468 sys 0m0.206s 00:07:55.468 18:11:39 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:55.468 18:11:39 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:55.468 ************************************ 00:07:55.468 END TEST accel_xor 00:07:55.468 ************************************ 00:07:55.726 18:11:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:55.726 18:11:39 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:55.726 18:11:39 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:07:55.726 18:11:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:55.726 18:11:39 accel -- common/autotest_common.sh@10 -- # set +x 00:07:55.726 ************************************ 00:07:55.726 START TEST accel_xor 00:07:55.726 ************************************ 00:07:55.726 18:11:39 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:07:55.726 18:11:39 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:07:55.726 18:11:39 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:07:55.726 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
00:07:55.726 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.726 18:11:39 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:55.726 18:11:39 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:55.726 18:11:39 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:07:55.726 18:11:39 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:55.727 18:11:39 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:55.727 18:11:39 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:55.727 18:11:39 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:55.727 18:11:39 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:55.727 18:11:39 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:07:55.727 18:11:39 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:07:55.727 [2024-07-12 18:11:39.301739] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:07:55.727 [2024-07-12 18:11:39.301801] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2431439 ] 00:07:55.727 [2024-07-12 18:11:39.430891] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.985 [2024-07-12 18:11:39.533200] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 
18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:55.985 18:11:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:57.359 18:11:40 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:57.359 00:07:57.359 real 0m1.510s 00:07:57.359 user 0m1.320s 00:07:57.359 sys 0m0.196s 00:07:57.359 18:11:40 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:57.359 18:11:40 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:07:57.359 ************************************ 00:07:57.359 END TEST accel_xor 00:07:57.359 ************************************ 00:07:57.359 18:11:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:57.359 18:11:40 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:57.359 18:11:40 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:57.359 18:11:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.359 18:11:40 accel -- common/autotest_common.sh@10 -- # set +x 00:07:57.359 ************************************ 00:07:57.359 START TEST accel_dif_verify 00:07:57.359 ************************************ 00:07:57.359 18:11:40 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:07:57.359 18:11:40 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:07:57.359 [2024-07-12 18:11:40.895576] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:57.359 [2024-07-12 18:11:40.895637] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2431640 ] 00:07:57.359 [2024-07-12 18:11:41.013297] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.618 [2024-07-12 18:11:41.114898] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 
-- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:57.618 18:11:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.987 18:11:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.987 18:11:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.987 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.987 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.987 18:11:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.987 18:11:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.987 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:58.988 18:11:42 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:58.988 00:07:58.988 real 0m1.503s 00:07:58.988 user 0m1.308s 00:07:58.988 sys 0m0.199s 00:07:58.988 18:11:42 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:58.988 18:11:42 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:07:58.988 ************************************ 00:07:58.988 END TEST accel_dif_verify 00:07:58.988 
************************************ 00:07:58.988 18:11:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:07:58.988 18:11:42 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:58.988 18:11:42 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:58.988 18:11:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:58.988 18:11:42 accel -- common/autotest_common.sh@10 -- # set +x 00:07:58.988 ************************************ 00:07:58.988 START TEST accel_dif_generate 00:07:58.988 ************************************ 00:07:58.988 18:11:42 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:58.988 18:11:42 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:07:58.988 18:11:42 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:07:58.988 [2024-07-12 18:11:42.480934] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:07:58.988 [2024-07-12 18:11:42.481006] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2431835 ] 00:07:58.988 [2024-07-12 18:11:42.610598] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.988 [2024-07-12 18:11:42.708141] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.270 18:11:42 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.270 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:07:59.271 18:11:42 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.271 
18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:07:59.271 18:11:42 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:08:00.643 18:11:43 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:00.643 00:08:00.643 real 0m1.496s 00:08:00.643 user 0m1.307s 00:08:00.643 sys 0m0.194s 00:08:00.643 18:11:43 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.643 18:11:43 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:08:00.643 ************************************ 00:08:00.643 END TEST 
accel_dif_generate 00:08:00.643 ************************************ 00:08:00.643 18:11:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:00.643 18:11:43 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:08:00.643 18:11:43 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:00.643 18:11:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:00.643 18:11:43 accel -- common/autotest_common.sh@10 -- # set +x 00:08:00.643 ************************************ 00:08:00.643 START TEST accel_dif_generate_copy 00:08:00.643 ************************************ 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:08:00.643 [2024-07-12 18:11:44.046715] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:08:00.643 [2024-07-12 18:11:44.046773] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2432038 ] 00:08:00.643 [2024-07-12 18:11:44.178274] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.643 [2024-07-12 18:11:44.279124] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.643 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:00.644 18:11:44 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:02.016 18:11:45 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:02.016 00:08:02.016 real 0m1.516s 00:08:02.016 user 0m1.319s 00:08:02.016 sys 0m0.197s 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.016 18:11:45 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:08:02.016 ************************************ 00:08:02.016 END TEST 
accel_dif_generate_copy 00:08:02.016 ************************************ 00:08:02.016 18:11:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:02.016 18:11:45 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:08:02.016 18:11:45 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:02.016 18:11:45 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:02.016 18:11:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.016 18:11:45 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.016 ************************************ 00:08:02.016 START TEST accel_comp 00:08:02.016 ************************************ 00:08:02.016 18:11:45 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.016 18:11:45 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.016 18:11:45 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:02.017 18:11:45 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:08:02.017 [2024-07-12 18:11:45.631655] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:08:02.017 [2024-07-12 18:11:45.631715] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2432307 ] 00:08:02.275 [2024-07-12 18:11:45.760256] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.275 [2024-07-12 18:11:45.860472] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:02.275 18:11:45 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:03.648 18:11:47 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:03.648 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:03.649 18:11:47 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:03.649 18:11:47 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.649 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:03.649 18:11:47 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:03.649 18:11:47 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:03.649 18:11:47 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:03.649 18:11:47 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:03.649 00:08:03.649 real 0m1.510s 00:08:03.649 user 0m1.319s 00:08:03.649 sys 0m0.195s 00:08:03.649 18:11:47 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:03.649 18:11:47 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:08:03.649 ************************************ 00:08:03.649 END TEST accel_comp 00:08:03.649 ************************************ 00:08:03.649 18:11:47 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:03.649 18:11:47 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:03.649 18:11:47 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:03.649 18:11:47 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.649 18:11:47 accel -- common/autotest_common.sh@10 -- # set +x 00:08:03.649 ************************************ 00:08:03.649 START TEST accel_decomp 00:08:03.649 ************************************ 00:08:03.649 18:11:47 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:03.649 18:11:47 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:03.649 18:11:47 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:03.649 [2024-07-12 18:11:47.231552] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:03.649 [2024-07-12 18:11:47.231612] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2432588 ] 00:08:03.649 [2024-07-12 18:11:47.360998] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.907 [2024-07-12 18:11:47.460779] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 
18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:03.907 18:11:47 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.907 18:11:47 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:08:03.907 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:03.908 18:11:47 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:03.908 18:11:47 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:03.908 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:03.908 18:11:47 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.280 18:11:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:05.280 18:11:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.280 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.280 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.280 18:11:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:05.280 18:11:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.280 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.280 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.281 18:11:48 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:05.281 18:11:48 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:05.281 00:08:05.281 real 0m1.503s 00:08:05.281 user 0m1.318s 00:08:05.281 sys 0m0.186s 00:08:05.281 18:11:48 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:05.281 18:11:48 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:05.281 ************************************ 00:08:05.281 END TEST accel_decomp 00:08:05.281 ************************************ 00:08:05.281 18:11:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:05.281 18:11:48 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:05.281 18:11:48 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:05.281 18:11:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:05.281 18:11:48 accel -- common/autotest_common.sh@10 -- # set +x 00:08:05.281 ************************************ 00:08:05.281 START TEST accel_decomp_full 00:08:05.281 ************************************ 00:08:05.281 18:11:48 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:05.281 
18:11:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:05.281 18:11:48 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:05.281 [2024-07-12 18:11:48.816117] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:05.281 [2024-07-12 18:11:48.816177] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2432790 ] 00:08:05.281 [2024-07-12 18:11:48.933122] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.539 [2024-07-12 18:11:49.038502] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.539 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:05.540 18:11:49 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=: 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:06.913 18:11:50 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:06.913 00:08:06.913 real 0m1.508s 00:08:06.913 user 0m1.328s 00:08:06.913 sys 0m0.187s 00:08:06.913 18:11:50 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:06.913 18:11:50 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:06.913 ************************************ 00:08:06.913 END TEST accel_decomp_full 00:08:06.913 ************************************ 00:08:06.913 18:11:50 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:06.913 18:11:50 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:06.913 18:11:50 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:06.913 18:11:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:06.913 18:11:50 accel -- common/autotest_common.sh@10 -- # set +x 00:08:06.913 
************************************ 00:08:06.913 START TEST accel_decomp_mcore 00:08:06.913 ************************************ 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:06.913 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:06.913 [2024-07-12 18:11:50.394193] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:06.913 [2024-07-12 18:11:50.394257] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2432986 ] 00:08:06.913 [2024-07-12 18:11:50.521317] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:06.913 [2024-07-12 18:11:50.625220] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:06.913 [2024-07-12 18:11:50.625305] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:06.913 [2024-07-12 18:11:50.625384] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:06.913 [2024-07-12 18:11:50.625388] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:07.172 18:11:50 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:08.546 00:08:08.546 real 0m1.515s 00:08:08.546 user 0m4.758s 00:08:08.546 sys 0m0.199s 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:08.546 18:11:51 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:08.546 ************************************ 00:08:08.546 END TEST accel_decomp_mcore 00:08:08.546 ************************************ 00:08:08.546 18:11:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:08.546 18:11:51 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.546 18:11:51 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:08.546 18:11:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:08.546 18:11:51 accel -- common/autotest_common.sh@10 -- # set +x 00:08:08.546 ************************************ 00:08:08.546 START TEST accel_decomp_full_mcore 00:08:08.546 ************************************ 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:08.546 18:11:51 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:08.546 18:11:51 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:08.546 [2024-07-12 18:11:51.992902] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:08.546 [2024-07-12 18:11:51.992969] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2433186 ] 00:08:08.546 [2024-07-12 18:11:52.123346] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:08.546 [2024-07-12 18:11:52.228381] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:08.546 [2024-07-12 18:11:52.228467] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:08.546 [2024-07-12 18:11:52.228547] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:08.546 [2024-07-12 18:11:52.228551] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:08.805 18:11:52 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.805 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:08.806 18:11:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.178 18:11:53 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.178 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:10.179 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:10.179 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:10.179 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:10.179 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:10.179 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:10.179 18:11:53 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:10.179 00:08:10.179 real 0m1.529s 00:08:10.179 user 0m4.785s 00:08:10.179 sys 0m0.211s 00:08:10.179 18:11:53 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.179 18:11:53 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:10.179 ************************************ 00:08:10.179 END TEST accel_decomp_full_mcore 00:08:10.179 ************************************ 00:08:10.179 18:11:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:10.179 18:11:53 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:10.179 18:11:53 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:10.179 18:11:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.179 18:11:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:10.179 
************************************ 00:08:10.179 START TEST accel_decomp_mthread 00:08:10.179 ************************************ 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:10.179 [2024-07-12 18:11:53.605793] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:10.179 [2024-07-12 18:11:53.605853] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2433413 ] 00:08:10.179 [2024-07-12 18:11:53.733561] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.179 [2024-07-12 18:11:53.830761] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.179 18:11:53 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.179 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.436 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.437 
18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:10.437 18:11:53 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.379 18:11:55 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:11.379 00:08:11.379 real 0m1.496s 00:08:11.379 user 0m1.312s 00:08:11.379 sys 0m0.190s 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:11.379 18:11:55 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:08:11.379 ************************************ 00:08:11.379 END TEST accel_decomp_mthread 00:08:11.379 ************************************ 00:08:11.637 18:11:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:11.637 18:11:55 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.637 18:11:55 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:11.638 18:11:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:11.638 18:11:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.638 ************************************ 00:08:11.638 START TEST accel_decomp_full_mthread 00:08:11.638 ************************************ 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:11.638 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:11.638 [2024-07-12 18:11:55.182736] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:08:11.638 [2024-07-12 18:11:55.182797] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2433729 ] 00:08:11.638 [2024-07-12 18:11:55.310922] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.896 [2024-07-12 18:11:55.412378] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 
00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 
-- # val='111250 bytes' 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.896 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 
-- # val=32 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- 
accel/accel.sh@19 -- # IFS=: 00:08:11.897 18:11:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:13.298 00:08:13.298 real 0m1.542s 00:08:13.298 user 0m1.352s 00:08:13.298 sys 0m0.194s 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:13.298 18:11:56 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:13.298 ************************************ 00:08:13.298 END TEST accel_decomp_full_mthread 00:08:13.298 ************************************ 00:08:13.298 18:11:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:13.298 18:11:56 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:13.298 18:11:56 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:13.298 18:11:56 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:13.298 18:11:56 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:13.298 18:11:56 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2433939 00:08:13.298 18:11:56 accel -- accel/accel.sh@63 -- # waitforlisten 2433939 00:08:13.298 18:11:56 accel -- 
common/autotest_common.sh@829 -- # '[' -z 2433939 ']' 00:08:13.298 18:11:56 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:13.298 18:11:56 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:13.298 18:11:56 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:13.298 18:11:56 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:13.298 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:13.298 18:11:56 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:13.298 18:11:56 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:13.298 18:11:56 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.298 18:11:56 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.299 18:11:56 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.299 18:11:56 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.299 18:11:56 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.299 18:11:56 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:13.299 18:11:56 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:13.299 18:11:56 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:13.299 18:11:56 accel -- accel/accel.sh@41 -- # jq -r . 00:08:13.299 [2024-07-12 18:11:56.808703] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:13.299 [2024-07-12 18:11:56.808785] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2433939 ] 00:08:13.299 [2024-07-12 18:11:56.932992] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.555 [2024-07-12 18:11:57.038912] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.121 [2024-07-12 18:11:57.797498] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:14.379 18:11:57 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:14.379 18:11:57 accel -- common/autotest_common.sh@862 -- # return 0 00:08:14.379 18:11:57 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:14.379 18:11:57 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:14.379 18:11:57 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:14.379 18:11:57 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:14.379 18:11:57 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:14.379 18:11:57 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:14.379 18:11:57 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:14.379 18:11:57 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:14.379 18:11:57 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.379 18:11:57 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:14.638 "method": "compressdev_scan_accel_module", 00:08:14.638 18:11:58 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:14.638 18:11:58 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:14.638 18:11:58 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- 
accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # IFS== 00:08:14.638 18:11:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:14.638 18:11:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:14.638 18:11:58 accel -- accel/accel.sh@75 -- # killprocess 2433939 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@948 -- # '[' -z 2433939 ']' 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@952 -- # kill -0 2433939 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@953 -- # uname 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2433939 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2433939' 00:08:14.638 killing process with pid 2433939 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@967 -- # 
kill 2433939 00:08:14.638 18:11:58 accel -- common/autotest_common.sh@972 -- # wait 2433939 00:08:15.204 18:11:58 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:15.204 18:11:58 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:15.204 18:11:58 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:15.204 18:11:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:15.204 18:11:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:15.204 ************************************ 00:08:15.204 START TEST accel_cdev_comp 00:08:15.204 ************************************ 00:08:15.204 18:11:58 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:15.204 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:15.204 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:15.205 18:11:58 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:15.205 18:11:58 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:15.205 [2024-07-12 18:11:58.725795] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:08:15.205 [2024-07-12 18:11:58.725856] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2434144 ] 00:08:15.205 [2024-07-12 18:11:58.856230] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.462 [2024-07-12 18:11:58.961602] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.027 [2024-07-12 18:11:59.736556] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:16.027 [2024-07-12 18:11:59.739177] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12b6080 PMD being used: compress_qat 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 [2024-07-12 18:11:59.743316] accel_dpdk_compressdev.c: 690:_set_pmd: 
*NOTICE*: Channel 0x12bae60 PMD being used: compress_qat 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:16.027 18:11:59 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.027 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.028 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:16.285 18:11:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:17.220 18:12:00 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:17.220 18:12:00 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:17.220 00:08:17.220 real 0m2.233s 00:08:17.220 user 0m0.022s 00:08:17.220 sys 0m0.005s 00:08:17.220 18:12:00 
accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:17.220 18:12:00 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:17.220 ************************************ 00:08:17.220 END TEST accel_cdev_comp 00:08:17.220 ************************************ 00:08:17.479 18:12:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:17.479 18:12:00 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:17.479 18:12:00 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:17.479 18:12:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.479 18:12:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.479 ************************************ 00:08:17.479 START TEST accel_cdev_decomp 00:08:17.479 ************************************ 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:17.479 18:12:01 
accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:17.479 18:12:01 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:17.479 [2024-07-12 18:12:01.040385] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:08:17.479 [2024-07-12 18:12:01.040444] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2434583 ] 00:08:17.479 [2024-07-12 18:12:01.169122] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.757 [2024-07-12 18:12:01.269831] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.324 [2024-07-12 18:12:02.037373] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:18.324 [2024-07-12 18:12:02.039948] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2549080 PMD being used: compress_qat 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.324 18:12:02 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.324 [2024-07-12 18:12:02.044145] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x254de60 PMD being used: compress_qat 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- 
accel/accel.sh@23 -- # accel_opc=decompress 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.324 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.582 18:12:02 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:18.582 18:12:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:18.582 18:12:02 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:19.535 18:12:03 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:19.535 00:08:19.535 real 0m2.217s 00:08:19.535 user 0m1.644s 00:08:19.535 sys 0m0.573s 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:19.535 18:12:03 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:19.535 ************************************ 00:08:19.535 END TEST accel_cdev_decomp 00:08:19.535 ************************************ 00:08:19.793 18:12:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:19.793 18:12:03 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:19.793 18:12:03 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:19.793 18:12:03 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:19.793 18:12:03 accel -- common/autotest_common.sh@10 -- # set +x 00:08:19.793 ************************************ 00:08:19.793 START TEST accel_cdev_decomp_full 00:08:19.793 ************************************ 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 
-- # read -r var val 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:19.793 18:12:03 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:19.793 [2024-07-12 18:12:03.333668] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:19.793 [2024-07-12 18:12:03.333730] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2435002 ] 00:08:19.793 [2024-07-12 18:12:03.459457] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.052 [2024-07-12 18:12:03.561234] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.619 [2024-07-12 18:12:04.324296] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:20.619 [2024-07-12 18:12:04.326871] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2738080 PMD being used: compress_qat 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 [2024-07-12 18:12:04.330246] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2737ce0 PMD being used: compress_qat 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:20.619 18:12:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:21.997 00:08:21.997 real 0m2.216s 00:08:21.997 user 0m1.638s 00:08:21.997 sys 0m0.578s 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:21.997 18:12:05 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:21.997 ************************************ 00:08:21.997 END TEST accel_cdev_decomp_full 00:08:21.997 ************************************ 00:08:21.997 18:12:05 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:21.997 18:12:05 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:21.997 18:12:05 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:21.997 18:12:05 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:21.997 18:12:05 accel -- common/autotest_common.sh@10 -- # set +x 00:08:21.997 ************************************ 00:08:21.997 START TEST accel_cdev_decomp_mcore 00:08:21.997 ************************************ 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:21.997 18:12:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:21.997 [2024-07-12 18:12:05.623444] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:21.997 [2024-07-12 18:12:05.623503] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2435529 ] 00:08:22.256 [2024-07-12 18:12:05.753937] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:22.256 [2024-07-12 18:12:05.858802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:22.256 [2024-07-12 18:12:05.858889] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:22.256 [2024-07-12 18:12:05.858967] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:22.256 [2024-07-12 18:12:05.858971] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.193 [2024-07-12 18:12:06.619351] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:23.193 [2024-07-12 18:12:06.621885] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1881720 PMD being used: compress_qat 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 
18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 [2024-07-12 18:12:06.627406] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f74e819b8b0 PMD being used: compress_qat 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 [2024-07-12 18:12:06.628167] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f74e019b8b0 PMD being used: compress_qat 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 [2024-07-12 18:12:06.629219] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x18869f0 PMD being used: compress_qat 00:08:23.193 [2024-07-12 18:12:06.629381] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f74d819b8b0 PMD being used: compress_qat 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 
18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:23.193 18:12:06 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:23.193 18:12:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:24.129 
18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:24.129 00:08:24.129 real 0m2.241s 00:08:24.129 user 0m7.239s 00:08:24.129 
sys 0m0.579s 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:24.129 18:12:07 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:24.129 ************************************ 00:08:24.129 END TEST accel_cdev_decomp_mcore 00:08:24.129 ************************************ 00:08:24.395 18:12:07 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:24.395 18:12:07 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:24.395 18:12:07 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:24.395 18:12:07 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:24.396 18:12:07 accel -- common/autotest_common.sh@10 -- # set +x 00:08:24.396 ************************************ 00:08:24.396 START TEST accel_cdev_decomp_full_mcore 00:08:24.396 ************************************ 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:24.396 18:12:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:24.396 [2024-07-12 18:12:07.935160] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:24.396 [2024-07-12 18:12:07.935220] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2435957 ] 00:08:24.396 [2024-07-12 18:12:08.064720] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:24.653 [2024-07-12 18:12:08.171719] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:24.653 [2024-07-12 18:12:08.171821] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:24.653 [2024-07-12 18:12:08.171921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:24.653 [2024-07-12 18:12:08.171943] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.221 [2024-07-12 18:12:08.934590] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:25.221 [2024-07-12 18:12:08.937183] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1769720 PMD being used: compress_qat 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.221 [2024-07-12 18:12:08.941960] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc05019b8b0 PMD being used: compress_qat 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.221 [2024-07-12 18:12:08.942747] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc04819b8b0 PMD being used: compress_qat 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.221 [2024-07-12 18:12:08.943835] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x176ca30 PMD being used: compress_qat 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:25.221 [2024-07-12 18:12:08.944046] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc04019b8b0 PMD being used: compress_qat 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 
-- # val=decompress 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.221 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# read -r var val 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:25.480 18:12:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:26.448 18:12:10 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:26.448 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:26.449 18:12:10 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:26.449 00:08:26.449 real 0m2.252s 00:08:26.449 user 0m7.219s 00:08:26.449 sys 0m0.602s 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:26.449 18:12:10 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:26.449 ************************************ 00:08:26.449 END TEST accel_cdev_decomp_full_mcore 00:08:26.449 ************************************ 00:08:26.718 18:12:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:26.718 18:12:10 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:26.718 18:12:10 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:26.718 18:12:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:26.718 18:12:10 accel -- common/autotest_common.sh@10 -- # set +x 00:08:26.718 ************************************ 00:08:26.718 START TEST accel_cdev_decomp_mthread 00:08:26.718 ************************************ 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@17 -- # local accel_module 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:26.718 18:12:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:26.718 [2024-07-12 18:12:10.268094] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:26.718 [2024-07-12 18:12:10.268154] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2436328 ] 00:08:26.718 [2024-07-12 18:12:10.396566] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.980 [2024-07-12 18:12:10.498408] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.546 [2024-07-12 18:12:11.260153] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:27.546 [2024-07-12 18:12:11.262674] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x19b0080 PMD being used: compress_qat 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:27.546 [2024-07-12 18:12:11.267549] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x19b52a0 PMD being used: compress_qat 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r 
var val 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.546 [2024-07-12 18:12:11.270082] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ad80f0 PMD being used: compress_qat 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.546 
18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.546 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.804 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:27.804 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:27.805 18:12:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- 
# val= 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:28.741 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:28.742 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:28.742 18:12:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:28.742 00:08:28.742 real 0m2.209s 00:08:28.742 user 0m1.626s 00:08:28.742 sys 0m0.584s 00:08:28.742 18:12:12 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:28.742 18:12:12 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:28.742 ************************************ 00:08:28.742 END TEST accel_cdev_decomp_mthread 00:08:28.742 ************************************ 00:08:29.000 18:12:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:29.000 18:12:12 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:29.000 18:12:12 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:29.000 18:12:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:29.001 18:12:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:29.001 ************************************ 00:08:29.001 START TEST accel_cdev_decomp_full_mthread 00:08:29.001 ************************************ 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:29.001 18:12:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:08:29.001 [2024-07-12 18:12:12.556378] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:08:29.001 [2024-07-12 18:12:12.556436] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2436660 ] 00:08:29.001 [2024-07-12 18:12:12.685649] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.259 [2024-07-12 18:12:12.786946] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.843 [2024-07-12 18:12:13.555835] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:30.136 [2024-07-12 18:12:13.558396] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1944080 PMD being used: compress_qat 00:08:30.136 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:30.136 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.136 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.136 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:30.137 [2024-07-12 18:12:13.562594] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x19473b0 PMD being used: compress_qat 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:30.137 [2024-07-12 18:12:13.565502] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a6bcc0 PMD being used: compress_qat 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:30.137 18:12:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:31.072 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:31.072 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:31.072 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:31.072 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:31.072 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:31.072 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:31.072 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:31.072 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:31.072 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:31.073 18:12:14 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:31.073 00:08:31.073 real 0m2.228s 00:08:31.073 user 0m1.652s 00:08:31.073 sys 0m0.576s 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.073 18:12:14 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:31.073 ************************************ 00:08:31.073 END TEST accel_cdev_decomp_full_mthread 00:08:31.073 ************************************ 00:08:31.073 18:12:14 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:08:31.073 18:12:14 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:31.073 18:12:14 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:31.073 18:12:14 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:31.073 18:12:14 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:31.073 18:12:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.073 18:12:14 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:31.073 18:12:14 accel -- common/autotest_common.sh@10 -- # set +x 00:08:31.073 18:12:14 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:31.073 18:12:14 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:31.073 18:12:14 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:31.073 18:12:14 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:31.073 18:12:14 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:31.073 18:12:14 accel -- accel/accel.sh@41 -- # jq -r . 00:08:31.331 ************************************ 00:08:31.331 START TEST accel_dif_functional_tests 00:08:31.331 ************************************ 00:08:31.331 18:12:14 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:31.331 [2024-07-12 18:12:14.893120] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:31.331 [2024-07-12 18:12:14.893180] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2436902 ] 00:08:31.331 [2024-07-12 18:12:15.023001] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:31.591 [2024-07-12 18:12:15.125396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:31.591 [2024-07-12 18:12:15.125479] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:31.591 [2024-07-12 18:12:15.125484] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.591 00:08:31.591 00:08:31.591 CUnit - A unit testing framework for C - Version 2.1-3 00:08:31.591 http://cunit.sourceforge.net/ 00:08:31.591 00:08:31.591 00:08:31.591 Suite: accel_dif 00:08:31.591 Test: verify: DIF generated, GUARD check ...passed 00:08:31.591 Test: verify: DIF generated, APPTAG check ...passed 00:08:31.591 Test: verify: DIF generated, REFTAG check ...passed 00:08:31.591 Test: verify: DIF not generated, GUARD check ...[2024-07-12 18:12:15.223592] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:31.591 passed 00:08:31.591 Test: verify: DIF not generated, APPTAG check ...[2024-07-12 18:12:15.223664] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:31.591 passed 00:08:31.591 Test: verify: DIF not generated, REFTAG check ...[2024-07-12 18:12:15.223695] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:31.591 passed 00:08:31.591 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:31.591 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-12 18:12:15.223765] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:31.591 passed 
00:08:31.591 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:31.591 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:31.591 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:31.591 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-12 18:12:15.223915] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:31.591 passed 00:08:31.591 Test: verify copy: DIF generated, GUARD check ...passed 00:08:31.591 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:31.591 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:31.591 Test: verify copy: DIF not generated, GUARD check ...[2024-07-12 18:12:15.224083] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:31.591 passed 00:08:31.591 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-12 18:12:15.224116] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:31.591 passed 00:08:31.591 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-12 18:12:15.224154] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:31.591 passed 00:08:31.591 Test: generate copy: DIF generated, GUARD check ...passed 00:08:31.591 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:31.591 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:31.591 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:31.591 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:31.591 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:31.591 Test: generate copy: iovecs-len validate ...[2024-07-12 18:12:15.224392] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:31.591 passed 00:08:31.591 Test: generate copy: buffer alignment validate ...passed 00:08:31.591 00:08:31.591 Run Summary: Type Total Ran Passed Failed Inactive 00:08:31.591 suites 1 1 n/a 0 0 00:08:31.591 tests 26 26 26 0 0 00:08:31.591 asserts 115 115 115 0 n/a 00:08:31.591 00:08:31.591 Elapsed time = 0.003 seconds 00:08:31.850 00:08:31.850 real 0m0.607s 00:08:31.850 user 0m0.818s 00:08:31.850 sys 0m0.235s 00:08:31.850 18:12:15 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.850 18:12:15 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:31.850 ************************************ 00:08:31.850 END TEST accel_dif_functional_tests 00:08:31.850 ************************************ 00:08:31.850 18:12:15 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:31.850 00:08:31.850 real 0m53.593s 00:08:31.850 user 1m1.688s 00:08:31.850 sys 0m12.015s 00:08:31.850 18:12:15 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.850 18:12:15 accel -- common/autotest_common.sh@10 -- # set +x 00:08:31.850 ************************************ 00:08:31.850 END TEST accel 00:08:31.850 ************************************ 00:08:31.850 18:12:15 -- common/autotest_common.sh@1142 -- # return 0 00:08:31.850 18:12:15 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:31.850 18:12:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:31.851 18:12:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.851 18:12:15 -- common/autotest_common.sh@10 -- # set +x 00:08:31.851 ************************************ 00:08:31.851 START TEST accel_rpc 00:08:31.851 ************************************ 00:08:31.851 18:12:15 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:32.109 * Looking for test storage... 
00:08:32.109 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:32.109 18:12:15 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:32.109 18:12:15 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2437131 00:08:32.109 18:12:15 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 2437131 00:08:32.109 18:12:15 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:32.109 18:12:15 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 2437131 ']' 00:08:32.109 18:12:15 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:32.109 18:12:15 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:32.109 18:12:15 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:32.109 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:32.109 18:12:15 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:32.109 18:12:15 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:32.109 [2024-07-12 18:12:15.750881] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:32.109 [2024-07-12 18:12:15.750962] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2437131 ] 00:08:32.367 [2024-07-12 18:12:15.879535] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.367 [2024-07-12 18:12:15.983369] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.933 18:12:16 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:32.933 18:12:16 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:32.933 18:12:16 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:32.933 18:12:16 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:32.933 18:12:16 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:32.933 18:12:16 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:32.933 18:12:16 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:32.933 18:12:16 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:32.933 18:12:16 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:32.933 18:12:16 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:32.933 ************************************ 00:08:32.933 START TEST accel_assign_opcode 00:08:32.933 ************************************ 00:08:32.933 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:32.933 18:12:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:32.933 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:32.933 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:32.933 [2024-07-12 18:12:16.653536] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:32.933 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:32.933 18:12:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:32.933 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:32.933 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:32.933 [2024-07-12 18:12:16.661555] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:33.192 18:12:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:33.451 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:33.451 software 00:08:33.451 00:08:33.451 real 0m0.305s 00:08:33.451 user 0m0.043s 00:08:33.451 sys 0m0.019s 00:08:33.451 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:08:33.451 18:12:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:33.451 ************************************ 00:08:33.451 END TEST accel_assign_opcode 00:08:33.451 ************************************ 00:08:33.451 18:12:16 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:33.451 18:12:16 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 2437131 00:08:33.451 18:12:16 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 2437131 ']' 00:08:33.451 18:12:16 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 2437131 00:08:33.451 18:12:16 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:33.451 18:12:17 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:33.451 18:12:17 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2437131 00:08:33.451 18:12:17 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:33.451 18:12:17 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:33.451 18:12:17 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2437131' 00:08:33.451 killing process with pid 2437131 00:08:33.451 18:12:17 accel_rpc -- common/autotest_common.sh@967 -- # kill 2437131 00:08:33.451 18:12:17 accel_rpc -- common/autotest_common.sh@972 -- # wait 2437131 00:08:34.018 00:08:34.018 real 0m1.870s 00:08:34.018 user 0m1.881s 00:08:34.018 sys 0m0.582s 00:08:34.018 18:12:17 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.018 18:12:17 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:34.018 ************************************ 00:08:34.018 END TEST accel_rpc 00:08:34.018 ************************************ 00:08:34.018 18:12:17 -- common/autotest_common.sh@1142 -- # return 0 00:08:34.018 18:12:17 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:34.018 18:12:17 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:34.018 18:12:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.018 18:12:17 -- common/autotest_common.sh@10 -- # set +x 00:08:34.018 ************************************ 00:08:34.018 START TEST app_cmdline 00:08:34.018 ************************************ 00:08:34.018 18:12:17 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:34.018 * Looking for test storage... 00:08:34.018 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:34.018 18:12:17 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:34.018 18:12:17 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=2437407 00:08:34.018 18:12:17 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 2437407 00:08:34.018 18:12:17 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:34.018 18:12:17 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 2437407 ']' 00:08:34.018 18:12:17 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:34.018 18:12:17 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:34.018 18:12:17 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:34.018 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:34.018 18:12:17 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:34.018 18:12:17 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:34.018 [2024-07-12 18:12:17.688110] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:34.018 [2024-07-12 18:12:17.688165] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2437407 ] 00:08:34.276 [2024-07-12 18:12:17.800179] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.276 [2024-07-12 18:12:17.897483] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.843 18:12:18 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:34.843 18:12:18 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:34.843 18:12:18 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:35.101 { 00:08:35.101 "version": "SPDK v24.09-pre git sha1 182dd7de4", 00:08:35.101 "fields": { 00:08:35.101 "major": 24, 00:08:35.101 "minor": 9, 00:08:35.101 "patch": 0, 00:08:35.101 "suffix": "-pre", 00:08:35.101 "commit": "182dd7de4" 00:08:35.101 } 00:08:35.101 } 00:08:35.101 18:12:18 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:35.101 18:12:18 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:35.101 18:12:18 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:35.101 18:12:18 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:35.101 18:12:18 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:35.101 18:12:18 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:35.101 18:12:18 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:35.101 18:12:18 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:35.101 18:12:18 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:35.101 18:12:18 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:35.359 18:12:18 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:35.359 18:12:18 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:35.359 18:12:18 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:35.359 18:12:18 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:35.359 request: 00:08:35.359 { 00:08:35.359 "method": "env_dpdk_get_mem_stats", 00:08:35.359 "req_id": 1 00:08:35.359 } 00:08:35.359 Got JSON-RPC error response 00:08:35.359 response: 00:08:35.359 { 00:08:35.359 
"code": -32601, 00:08:35.359 "message": "Method not found" 00:08:35.359 } 00:08:35.359 18:12:19 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:35.617 18:12:19 app_cmdline -- app/cmdline.sh@1 -- # killprocess 2437407 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 2437407 ']' 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 2437407 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2437407 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:35.617 18:12:19 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2437407' 00:08:35.618 killing process with pid 2437407 00:08:35.618 18:12:19 app_cmdline -- common/autotest_common.sh@967 -- # kill 2437407 00:08:35.618 18:12:19 app_cmdline -- common/autotest_common.sh@972 -- # wait 2437407 00:08:35.877 00:08:35.877 real 0m1.997s 00:08:35.877 user 0m2.392s 00:08:35.877 sys 0m0.574s 00:08:35.877 18:12:19 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:35.877 18:12:19 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:35.877 ************************************ 00:08:35.877 END TEST app_cmdline 00:08:35.877 ************************************ 00:08:35.877 18:12:19 -- common/autotest_common.sh@1142 -- # return 0 00:08:35.877 18:12:19 -- spdk/autotest.sh@186 -- # run_test version 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:35.877 18:12:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:35.877 18:12:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:35.877 18:12:19 -- common/autotest_common.sh@10 -- # set +x 00:08:36.136 ************************************ 00:08:36.136 START TEST version 00:08:36.136 ************************************ 00:08:36.136 18:12:19 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:36.136 * Looking for test storage... 00:08:36.136 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:36.136 18:12:19 version -- app/version.sh@17 -- # get_header_version major 00:08:36.136 18:12:19 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:36.136 18:12:19 version -- app/version.sh@14 -- # cut -f2 00:08:36.136 18:12:19 version -- app/version.sh@14 -- # tr -d '"' 00:08:36.136 18:12:19 version -- app/version.sh@17 -- # major=24 00:08:36.136 18:12:19 version -- app/version.sh@18 -- # get_header_version minor 00:08:36.136 18:12:19 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:36.136 18:12:19 version -- app/version.sh@14 -- # cut -f2 00:08:36.136 18:12:19 version -- app/version.sh@14 -- # tr -d '"' 00:08:36.136 18:12:19 version -- app/version.sh@18 -- # minor=9 00:08:36.136 18:12:19 version -- app/version.sh@19 -- # get_header_version patch 00:08:36.136 18:12:19 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:36.136 18:12:19 version -- app/version.sh@14 -- # cut -f2 00:08:36.136 18:12:19 version -- app/version.sh@14 -- # tr -d '"' 00:08:36.136 
18:12:19 version -- app/version.sh@19 -- # patch=0 00:08:36.136 18:12:19 version -- app/version.sh@20 -- # get_header_version suffix 00:08:36.136 18:12:19 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:36.136 18:12:19 version -- app/version.sh@14 -- # cut -f2 00:08:36.136 18:12:19 version -- app/version.sh@14 -- # tr -d '"' 00:08:36.136 18:12:19 version -- app/version.sh@20 -- # suffix=-pre 00:08:36.136 18:12:19 version -- app/version.sh@22 -- # version=24.9 00:08:36.136 18:12:19 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:36.136 18:12:19 version -- app/version.sh@28 -- # version=24.9rc0 00:08:36.136 18:12:19 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:36.136 18:12:19 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:36.136 18:12:19 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:36.136 18:12:19 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:36.136 00:08:36.136 real 0m0.189s 00:08:36.136 user 0m0.094s 00:08:36.136 sys 0m0.143s 00:08:36.136 18:12:19 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:36.136 18:12:19 version -- common/autotest_common.sh@10 -- # set +x 00:08:36.136 ************************************ 00:08:36.136 END TEST version 00:08:36.136 ************************************ 00:08:36.136 18:12:19 -- common/autotest_common.sh@1142 -- # return 0 00:08:36.136 18:12:19 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:36.136 18:12:19 -- spdk/autotest.sh@189 -- # run_test blockdev_general 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:36.136 18:12:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:36.136 18:12:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.136 18:12:19 -- common/autotest_common.sh@10 -- # set +x 00:08:36.394 ************************************ 00:08:36.394 START TEST blockdev_general 00:08:36.394 ************************************ 00:08:36.394 18:12:19 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:36.394 * Looking for test storage... 00:08:36.394 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:36.394 18:12:19 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 
00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2437860 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 2437860 00:08:36.394 18:12:19 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 2437860 ']' 00:08:36.394 18:12:19 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:36.394 18:12:19 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:36.394 18:12:19 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:36.394 18:12:19 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:08:36.394 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:36.394 18:12:19 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:36.394 18:12:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:36.394 [2024-07-12 18:12:20.063301] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:08:36.394 [2024-07-12 18:12:20.063375] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2437860 ] 00:08:36.652 [2024-07-12 18:12:20.205523] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.652 [2024-07-12 18:12:20.339670] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.588 18:12:21 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:37.588 18:12:21 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:08:37.588 18:12:21 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:08:37.588 18:12:21 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:08:37.588 18:12:21 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:08:37.588 18:12:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:37.588 18:12:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:37.848 [2024-07-12 18:12:21.326667] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:37.848 [2024-07-12 18:12:21.326718] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:37.848 00:08:37.848 [2024-07-12 18:12:21.334655] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:37.848 [2024-07-12 18:12:21.334679] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:08:37.848 00:08:37.848 Malloc0 00:08:37.848 Malloc1 00:08:37.848 Malloc2 00:08:37.848 Malloc3 00:08:37.848 Malloc4 00:08:37.848 Malloc5 00:08:37.848 Malloc6 00:08:37.848 Malloc7 00:08:37.848 Malloc8 00:08:37.848 Malloc9 00:08:37.848 [2024-07-12 18:12:21.472048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:37.848 [2024-07-12 18:12:21.472095] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:37.848 [2024-07-12 18:12:21.472115] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1207350 00:08:37.848 [2024-07-12 18:12:21.472128] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:37.848 [2024-07-12 18:12:21.473472] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:37.848 [2024-07-12 18:12:21.473499] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:37.848 TestPT 00:08:37.848 18:12:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:37.848 18:12:21 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:08:37.848 5000+0 records in 00:08:37.848 5000+0 records out 00:08:37.848 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0242134 s, 423 MB/s 00:08:37.848 18:12:21 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:08:37.848 18:12:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:37.848 18:12:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.107 AIO0 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.107 18:12:21 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:08:38.107 18:12:21 blockdev_general -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.107 18:12:21 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:08:38.107 18:12:21 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.107 18:12:21 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.107 18:12:21 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.107 18:12:21 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:08:38.107 18:12:21 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:08:38.107 18:12:21 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:38.107 18:12:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.367 18:12:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:38.367 18:12:21 blockdev_general 
-- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:08:38.367 18:12:21 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:08:38.368 18:12:21 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "8c026948-175e-43a4-9c68-661d1aa15a51"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8c026948-175e-43a4-9c68-661d1aa15a51",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "10a8a23b-75e7-5a95-a595-934c63455d74"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "10a8a23b-75e7-5a95-a595-934c63455d74",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "92b07044-9741-5918-86bb-7dc7a7ff5094"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "92b07044-9741-5918-86bb-7dc7a7ff5094",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "b605ca3f-a8ce-57fb-8a95-9efc3a7c9873"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b605ca3f-a8ce-57fb-8a95-9efc3a7c9873",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "874975c9-499f-5584-8e76-a3baa87ae239"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "874975c9-499f-5584-8e76-a3baa87ae239",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "24a31cf2-8c7b-5a42-894f-9035e0295699"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "24a31cf2-8c7b-5a42-894f-9035e0295699",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": 
false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d97aed29-4bfe-5d15-bbda-b935615639e9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d97aed29-4bfe-5d15-bbda-b935615639e9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "91c509dc-cd44-5a4b-992e-0450ebb2e36e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "91c509dc-cd44-5a4b-992e-0450ebb2e36e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' 
' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "52c6ac66-4aa9-5435-a650-d22d2f799ff0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "52c6ac66-4aa9-5435-a650-d22d2f799ff0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "ff93a3ae-9750-5893-b29c-979716dfe45a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ff93a3ae-9750-5893-b29c-979716dfe45a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "e2c11e67-14f3-5dfe-bc1b-ad64b5e428eb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e2c11e67-14f3-5dfe-bc1b-ad64b5e428eb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "7b697129-79b5-5be7-9104-aa2d250452a5"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7b697129-79b5-5be7-9104-aa2d250452a5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "b18e8578-2643-4520-9d9d-c641d0cdb6cb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "b18e8578-2643-4520-9d9d-c641d0cdb6cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "b18e8578-2643-4520-9d9d-c641d0cdb6cb",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "a2091772-e110-46aa-a46b-7c07530e6ca9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "82d0c7ef-46f1-44df-b775-eed1988d029c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "0df88812-5798-4dba-b9fb-c9f9390f0234"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "0df88812-5798-4dba-b9fb-c9f9390f0234",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0df88812-5798-4dba-b9fb-c9f9390f0234",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "bdbffdf3-f179-4faf-8ab4-95de8aee6143",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "81c8b237-3342-48dc-99d7-48c6e1c22d95",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "b721d4ff-426f-4dcf-9f63-13f4c0f8fcfd"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "b721d4ff-426f-4dcf-9f63-13f4c0f8fcfd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "b721d4ff-426f-4dcf-9f63-13f4c0f8fcfd",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "e4bce0c5-9d51-489c-8914-8554a294b2f6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "423fd104-aae0-4492-8a7c-3d25ec5fe345",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "97c75e37-b558-4fcf-a0d6-c74abc978df6"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "97c75e37-b558-4fcf-a0d6-c74abc978df6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:38.368 18:12:21 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:08:38.368 18:12:21 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:08:38.368 18:12:21 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:08:38.368 18:12:21 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 2437860 00:08:38.368 18:12:21 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 2437860 ']' 00:08:38.368 18:12:21 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 2437860 00:08:38.369 18:12:21 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:08:38.369 18:12:21 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:38.369 18:12:21 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2437860 00:08:38.369 18:12:22 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:38.369 18:12:22 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:38.369 18:12:22 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2437860' 00:08:38.369 killing process with pid 2437860 00:08:38.369 18:12:22 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 2437860 00:08:38.369 18:12:22 blockdev_general -- common/autotest_common.sh@972 -- # wait 2437860 00:08:38.940 18:12:22 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:38.940 18:12:22 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:38.940 18:12:22 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:38.940 18:12:22 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:38.940 18:12:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.940 ************************************ 00:08:38.940 START TEST bdev_hello_world 00:08:38.940 ************************************ 00:08:38.940 18:12:22 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:08:38.940 [2024-07-12 18:12:22.611657] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:38.940 [2024-07-12 18:12:22.611715] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2438235 ] 00:08:39.198 [2024-07-12 18:12:22.737147] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.198 [2024-07-12 18:12:22.836656] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.456 [2024-07-12 18:12:22.997760] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:39.456 [2024-07-12 18:12:22.997836] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:39.456 [2024-07-12 18:12:22.997851] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:39.456 [2024-07-12 18:12:23.005760] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:39.456 [2024-07-12 18:12:23.005789] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:39.456 [2024-07-12 18:12:23.013774] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:39.456 [2024-07-12 18:12:23.013802] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:39.456 [2024-07-12 18:12:23.091087] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:39.456 [2024-07-12 18:12:23.091139] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:39.456 [2024-07-12 18:12:23.091158] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdda3c0 00:08:39.456 [2024-07-12 18:12:23.091172] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:39.456 [2024-07-12 18:12:23.092600] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:08:39.456 [2024-07-12 18:12:23.092629] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:39.713 [2024-07-12 18:12:23.244372] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:08:39.713 [2024-07-12 18:12:23.244437] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:08:39.713 [2024-07-12 18:12:23.244492] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:08:39.713 [2024-07-12 18:12:23.244568] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:08:39.713 [2024-07-12 18:12:23.244647] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:08:39.713 [2024-07-12 18:12:23.244678] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:08:39.713 [2024-07-12 18:12:23.244751] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:08:39.713 00:08:39.713 [2024-07-12 18:12:23.244792] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:08:39.972 00:08:39.972 real 0m1.024s 00:08:39.972 user 0m0.672s 00:08:39.972 sys 0m0.308s 00:08:39.972 18:12:23 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.972 18:12:23 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:08:39.972 ************************************ 00:08:39.972 END TEST bdev_hello_world 00:08:39.972 ************************************ 00:08:39.972 18:12:23 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:08:39.972 18:12:23 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:08:39.972 18:12:23 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:39.972 18:12:23 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.972 18:12:23 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:39.972 ************************************ 00:08:39.972 START 
TEST bdev_bounds 00:08:39.972 ************************************ 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2438427 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2438427' 00:08:39.972 Process bdevio pid: 2438427 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2438427 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2438427 ']' 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:39.972 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:39.972 18:12:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:40.230 [2024-07-12 18:12:23.722048] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:08:40.230 [2024-07-12 18:12:23.722111] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2438427 ] 00:08:40.230 [2024-07-12 18:12:23.850584] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:40.230 [2024-07-12 18:12:23.955836] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:40.230 [2024-07-12 18:12:23.955860] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:40.230 [2024-07-12 18:12:23.955865] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.488 [2024-07-12 18:12:24.109427] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:40.488 [2024-07-12 18:12:24.109488] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:40.488 [2024-07-12 18:12:24.109503] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:40.488 [2024-07-12 18:12:24.117439] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:40.488 [2024-07-12 18:12:24.117466] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:40.488 [2024-07-12 18:12:24.125446] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:40.488 [2024-07-12 18:12:24.125470] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:40.488 [2024-07-12 18:12:24.202723] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:40.488 [2024-07-12 18:12:24.202776] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:40.488 [2024-07-12 18:12:24.202794] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2c000c0 
00:08:40.488 [2024-07-12 18:12:24.202807] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:40.488 [2024-07-12 18:12:24.204263] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:40.488 [2024-07-12 18:12:24.204291] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:41.052 18:12:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:41.052 18:12:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:08:41.052 18:12:24 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:08:41.311 I/O targets: 00:08:41.311 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:08:41.311 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:08:41.311 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:08:41.311 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:08:41.311 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:08:41.311 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:08:41.311 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:08:41.311 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:08:41.311 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:08:41.311 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:08:41.311 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:08:41.311 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:08:41.311 raid0: 131072 blocks of 512 bytes (64 MiB) 00:08:41.311 concat0: 131072 blocks of 512 bytes (64 MiB) 00:08:41.311 raid1: 65536 blocks of 512 bytes (32 MiB) 00:08:41.311 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:08:41.311 00:08:41.311 00:08:41.311 CUnit - A unit testing framework for C - Version 2.1-3 00:08:41.311 http://cunit.sourceforge.net/ 00:08:41.311 00:08:41.311 00:08:41.311 Suite: bdevio tests on: AIO0 00:08:41.311 Test: blockdev write read block ...passed 00:08:41.311 Test: blockdev write zeroes read block ...passed 00:08:41.311 
Test: blockdev write zeroes read no split ...passed 00:08:41.311 Test: blockdev write zeroes read split ...passed 00:08:41.311 Test: blockdev write zeroes read split partial ...passed 00:08:41.311 Test: blockdev reset ...passed 00:08:41.311 Test: blockdev write read 8 blocks ...passed 00:08:41.311 Test: blockdev write read size > 128k ...passed 00:08:41.311 Test: blockdev write read invalid size ...passed 00:08:41.311 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.311 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.311 Test: blockdev write read max offset ...passed 00:08:41.311 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.311 Test: blockdev writev readv 8 blocks ...passed 00:08:41.311 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.311 Test: blockdev writev readv block ...passed 00:08:41.311 Test: blockdev writev readv size > 128k ...passed 00:08:41.311 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.311 Test: blockdev comparev and writev ...passed 00:08:41.311 Test: blockdev nvme passthru rw ...passed 00:08:41.311 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.311 Test: blockdev nvme admin passthru ...passed 00:08:41.311 Test: blockdev copy ...passed 00:08:41.311 Suite: bdevio tests on: raid1 00:08:41.311 Test: blockdev write read block ...passed 00:08:41.311 Test: blockdev write zeroes read block ...passed 00:08:41.311 Test: blockdev write zeroes read no split ...passed 00:08:41.311 Test: blockdev write zeroes read split ...passed 00:08:41.311 Test: blockdev write zeroes read split partial ...passed 00:08:41.311 Test: blockdev reset ...passed 00:08:41.311 Test: blockdev write read 8 blocks ...passed 00:08:41.311 Test: blockdev write read size > 128k ...passed 00:08:41.311 Test: blockdev write read invalid size ...passed 00:08:41.311 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:08:41.311 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.311 Test: blockdev write read max offset ...passed 00:08:41.311 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.311 Test: blockdev writev readv 8 blocks ...passed 00:08:41.311 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.311 Test: blockdev writev readv block ...passed 00:08:41.311 Test: blockdev writev readv size > 128k ...passed 00:08:41.311 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.311 Test: blockdev comparev and writev ...passed 00:08:41.311 Test: blockdev nvme passthru rw ...passed 00:08:41.311 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.311 Test: blockdev nvme admin passthru ...passed 00:08:41.311 Test: blockdev copy ...passed 00:08:41.311 Suite: bdevio tests on: concat0 00:08:41.311 Test: blockdev write read block ...passed 00:08:41.311 Test: blockdev write zeroes read block ...passed 00:08:41.311 Test: blockdev write zeroes read no split ...passed 00:08:41.311 Test: blockdev write zeroes read split ...passed 00:08:41.311 Test: blockdev write zeroes read split partial ...passed 00:08:41.311 Test: blockdev reset ...passed 00:08:41.311 Test: blockdev write read 8 blocks ...passed 00:08:41.311 Test: blockdev write read size > 128k ...passed 00:08:41.311 Test: blockdev write read invalid size ...passed 00:08:41.311 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.311 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.311 Test: blockdev write read max offset ...passed 00:08:41.311 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.311 Test: blockdev writev readv 8 blocks ...passed 00:08:41.311 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.311 Test: blockdev writev readv block ...passed 00:08:41.311 Test: blockdev writev readv size > 128k ...passed 00:08:41.311 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:08:41.311 Test: blockdev comparev and writev ...passed 00:08:41.311 Test: blockdev nvme passthru rw ...passed 00:08:41.311 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.311 Test: blockdev nvme admin passthru ...passed 00:08:41.311 Test: blockdev copy ...passed 00:08:41.311 Suite: bdevio tests on: raid0 00:08:41.311 Test: blockdev write read block ...passed 00:08:41.311 Test: blockdev write zeroes read block ...passed 00:08:41.311 Test: blockdev write zeroes read no split ...passed 00:08:41.311 Test: blockdev write zeroes read split ...passed 00:08:41.311 Test: blockdev write zeroes read split partial ...passed 00:08:41.311 Test: blockdev reset ...passed 00:08:41.311 Test: blockdev write read 8 blocks ...passed 00:08:41.311 Test: blockdev write read size > 128k ...passed 00:08:41.311 Test: blockdev write read invalid size ...passed 00:08:41.312 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.312 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.312 Test: blockdev write read max offset ...passed 00:08:41.312 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.312 Test: blockdev writev readv 8 blocks ...passed 00:08:41.312 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.312 Test: blockdev writev readv block ...passed 00:08:41.312 Test: blockdev writev readv size > 128k ...passed 00:08:41.312 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.312 Test: blockdev comparev and writev ...passed 00:08:41.312 Test: blockdev nvme passthru rw ...passed 00:08:41.312 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.312 Test: blockdev nvme admin passthru ...passed 00:08:41.312 Test: blockdev copy ...passed 00:08:41.312 Suite: bdevio tests on: TestPT 00:08:41.312 Test: blockdev write read block ...passed 00:08:41.312 Test: blockdev write zeroes read block ...passed 
00:08:41.312 Test: blockdev write zeroes read no split ...passed 00:08:41.312 Test: blockdev write zeroes read split ...passed 00:08:41.312 Test: blockdev write zeroes read split partial ...passed 00:08:41.312 Test: blockdev reset ...passed 00:08:41.312 Test: blockdev write read 8 blocks ...passed 00:08:41.312 Test: blockdev write read size > 128k ...passed 00:08:41.312 Test: blockdev write read invalid size ...passed 00:08:41.312 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.312 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.312 Test: blockdev write read max offset ...passed 00:08:41.312 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.312 Test: blockdev writev readv 8 blocks ...passed 00:08:41.312 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.312 Test: blockdev writev readv block ...passed 00:08:41.312 Test: blockdev writev readv size > 128k ...passed 00:08:41.312 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.312 Test: blockdev comparev and writev ...passed 00:08:41.312 Test: blockdev nvme passthru rw ...passed 00:08:41.312 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.312 Test: blockdev nvme admin passthru ...passed 00:08:41.312 Test: blockdev copy ...passed 00:08:41.312 Suite: bdevio tests on: Malloc2p7 00:08:41.312 Test: blockdev write read block ...passed 00:08:41.312 Test: blockdev write zeroes read block ...passed 00:08:41.312 Test: blockdev write zeroes read no split ...passed 00:08:41.312 Test: blockdev write zeroes read split ...passed 00:08:41.312 Test: blockdev write zeroes read split partial ...passed 00:08:41.312 Test: blockdev reset ...passed 00:08:41.312 Test: blockdev write read 8 blocks ...passed 00:08:41.312 Test: blockdev write read size > 128k ...passed 00:08:41.312 Test: blockdev write read invalid size ...passed 00:08:41.312 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:08:41.312 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.312 Test: blockdev write read max offset ...passed 00:08:41.312 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.312 Test: blockdev writev readv 8 blocks ...passed 00:08:41.312 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.312 Test: blockdev writev readv block ...passed 00:08:41.312 Test: blockdev writev readv size > 128k ...passed 00:08:41.312 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.312 Test: blockdev comparev and writev ...passed 00:08:41.312 Test: blockdev nvme passthru rw ...passed 00:08:41.312 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.312 Test: blockdev nvme admin passthru ...passed 00:08:41.312 Test: blockdev copy ...passed 00:08:41.312 Suite: bdevio tests on: Malloc2p6 00:08:41.312 Test: blockdev write read block ...passed 00:08:41.312 Test: blockdev write zeroes read block ...passed 00:08:41.312 Test: blockdev write zeroes read no split ...passed 00:08:41.312 Test: blockdev write zeroes read split ...passed 00:08:41.312 Test: blockdev write zeroes read split partial ...passed 00:08:41.312 Test: blockdev reset ...passed 00:08:41.312 Test: blockdev write read 8 blocks ...passed 00:08:41.312 Test: blockdev write read size > 128k ...passed 00:08:41.312 Test: blockdev write read invalid size ...passed 00:08:41.312 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.312 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.312 Test: blockdev write read max offset ...passed 00:08:41.312 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.312 Test: blockdev writev readv 8 blocks ...passed 00:08:41.312 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.312 Test: blockdev writev readv block ...passed 00:08:41.312 Test: blockdev writev readv size > 128k ...passed 00:08:41.312 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.312 Test: blockdev comparev and writev ...passed 00:08:41.312 Test: blockdev nvme passthru rw ...passed 00:08:41.312 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.312 Test: blockdev nvme admin passthru ...passed 00:08:41.312 Test: blockdev copy ...passed 00:08:41.312 Suite: bdevio tests on: Malloc2p5 00:08:41.312 Test: blockdev write read block ...passed 00:08:41.312 Test: blockdev write zeroes read block ...passed 00:08:41.312 Test: blockdev write zeroes read no split ...passed 00:08:41.312 Test: blockdev write zeroes read split ...passed 00:08:41.312 Test: blockdev write zeroes read split partial ...passed 00:08:41.312 Test: blockdev reset ...passed 00:08:41.312 Test: blockdev write read 8 blocks ...passed 00:08:41.312 Test: blockdev write read size > 128k ...passed 00:08:41.312 Test: blockdev write read invalid size ...passed 00:08:41.312 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.312 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.312 Test: blockdev write read max offset ...passed 00:08:41.312 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.312 Test: blockdev writev readv 8 blocks ...passed 00:08:41.312 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.312 Test: blockdev writev readv block ...passed 00:08:41.312 Test: blockdev writev readv size > 128k ...passed 00:08:41.312 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.312 Test: blockdev comparev and writev ...passed 00:08:41.312 Test: blockdev nvme passthru rw ...passed 00:08:41.312 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.312 Test: blockdev nvme admin passthru ...passed 00:08:41.312 Test: blockdev copy ...passed 00:08:41.312 Suite: bdevio tests on: Malloc2p4 00:08:41.312 Test: blockdev write read block ...passed 00:08:41.312 Test: blockdev write zeroes read block 
...passed 00:08:41.312 Test: blockdev write zeroes read no split ...passed 00:08:41.312 Test: blockdev write zeroes read split ...passed 00:08:41.312 Test: blockdev write zeroes read split partial ...passed 00:08:41.312 Test: blockdev reset ...passed 00:08:41.312 Test: blockdev write read 8 blocks ...passed 00:08:41.312 Test: blockdev write read size > 128k ...passed 00:08:41.312 Test: blockdev write read invalid size ...passed 00:08:41.312 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.312 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.312 Test: blockdev write read max offset ...passed 00:08:41.312 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.312 Test: blockdev writev readv 8 blocks ...passed 00:08:41.312 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.312 Test: blockdev writev readv block ...passed 00:08:41.312 Test: blockdev writev readv size > 128k ...passed 00:08:41.312 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.312 Test: blockdev comparev and writev ...passed 00:08:41.312 Test: blockdev nvme passthru rw ...passed 00:08:41.312 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.312 Test: blockdev nvme admin passthru ...passed 00:08:41.312 Test: blockdev copy ...passed 00:08:41.312 Suite: bdevio tests on: Malloc2p3 00:08:41.312 Test: blockdev write read block ...passed 00:08:41.312 Test: blockdev write zeroes read block ...passed 00:08:41.312 Test: blockdev write zeroes read no split ...passed 00:08:41.312 Test: blockdev write zeroes read split ...passed 00:08:41.312 Test: blockdev write zeroes read split partial ...passed 00:08:41.312 Test: blockdev reset ...passed 00:08:41.312 Test: blockdev write read 8 blocks ...passed 00:08:41.312 Test: blockdev write read size > 128k ...passed 00:08:41.312 Test: blockdev write read invalid size ...passed 00:08:41.312 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:08:41.312 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.312 Test: blockdev write read max offset ...passed 00:08:41.312 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.312 Test: blockdev writev readv 8 blocks ...passed 00:08:41.312 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.312 Test: blockdev writev readv block ...passed 00:08:41.312 Test: blockdev writev readv size > 128k ...passed 00:08:41.312 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.312 Test: blockdev comparev and writev ...passed 00:08:41.312 Test: blockdev nvme passthru rw ...passed 00:08:41.312 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.312 Test: blockdev nvme admin passthru ...passed 00:08:41.312 Test: blockdev copy ...passed 00:08:41.312 Suite: bdevio tests on: Malloc2p2 00:08:41.312 Test: blockdev write read block ...passed 00:08:41.312 Test: blockdev write zeroes read block ...passed 00:08:41.312 Test: blockdev write zeroes read no split ...passed 00:08:41.312 Test: blockdev write zeroes read split ...passed 00:08:41.312 Test: blockdev write zeroes read split partial ...passed 00:08:41.312 Test: blockdev reset ...passed 00:08:41.312 Test: blockdev write read 8 blocks ...passed 00:08:41.312 Test: blockdev write read size > 128k ...passed 00:08:41.312 Test: blockdev write read invalid size ...passed 00:08:41.312 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.312 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.312 Test: blockdev write read max offset ...passed 00:08:41.312 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.312 Test: blockdev writev readv 8 blocks ...passed 00:08:41.312 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.312 Test: blockdev writev readv block ...passed 00:08:41.312 Test: blockdev writev readv size > 128k ...passed 
00:08:41.312 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.312 Test: blockdev comparev and writev ...passed 00:08:41.312 Test: blockdev nvme passthru rw ...passed 00:08:41.312 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.312 Test: blockdev nvme admin passthru ...passed 00:08:41.312 Test: blockdev copy ...passed 00:08:41.312 Suite: bdevio tests on: Malloc2p1 00:08:41.312 Test: blockdev write read block ...passed 00:08:41.312 Test: blockdev write zeroes read block ...passed 00:08:41.312 Test: blockdev write zeroes read no split ...passed 00:08:41.313 Test: blockdev write zeroes read split ...passed 00:08:41.313 Test: blockdev write zeroes read split partial ...passed 00:08:41.313 Test: blockdev reset ...passed 00:08:41.313 Test: blockdev write read 8 blocks ...passed 00:08:41.313 Test: blockdev write read size > 128k ...passed 00:08:41.313 Test: blockdev write read invalid size ...passed 00:08:41.313 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.313 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.313 Test: blockdev write read max offset ...passed 00:08:41.313 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.313 Test: blockdev writev readv 8 blocks ...passed 00:08:41.313 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.313 Test: blockdev writev readv block ...passed 00:08:41.313 Test: blockdev writev readv size > 128k ...passed 00:08:41.313 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.313 Test: blockdev comparev and writev ...passed 00:08:41.313 Test: blockdev nvme passthru rw ...passed 00:08:41.313 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.313 Test: blockdev nvme admin passthru ...passed 00:08:41.313 Test: blockdev copy ...passed 00:08:41.313 Suite: bdevio tests on: Malloc2p0 00:08:41.313 Test: blockdev write read block ...passed 00:08:41.313 Test: blockdev write 
zeroes read block ...passed 00:08:41.313 Test: blockdev write zeroes read no split ...passed 00:08:41.313 Test: blockdev write zeroes read split ...passed 00:08:41.313 Test: blockdev write zeroes read split partial ...passed 00:08:41.313 Test: blockdev reset ...passed 00:08:41.313 Test: blockdev write read 8 blocks ...passed 00:08:41.313 Test: blockdev write read size > 128k ...passed 00:08:41.313 Test: blockdev write read invalid size ...passed 00:08:41.313 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.313 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.313 Test: blockdev write read max offset ...passed 00:08:41.313 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.313 Test: blockdev writev readv 8 blocks ...passed 00:08:41.313 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.313 Test: blockdev writev readv block ...passed 00:08:41.313 Test: blockdev writev readv size > 128k ...passed 00:08:41.313 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.313 Test: blockdev comparev and writev ...passed 00:08:41.313 Test: blockdev nvme passthru rw ...passed 00:08:41.313 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.313 Test: blockdev nvme admin passthru ...passed 00:08:41.313 Test: blockdev copy ...passed 00:08:41.313 Suite: bdevio tests on: Malloc1p1 00:08:41.313 Test: blockdev write read block ...passed 00:08:41.313 Test: blockdev write zeroes read block ...passed 00:08:41.313 Test: blockdev write zeroes read no split ...passed 00:08:41.313 Test: blockdev write zeroes read split ...passed 00:08:41.313 Test: blockdev write zeroes read split partial ...passed 00:08:41.313 Test: blockdev reset ...passed 00:08:41.313 Test: blockdev write read 8 blocks ...passed 00:08:41.313 Test: blockdev write read size > 128k ...passed 00:08:41.313 Test: blockdev write read invalid size ...passed 00:08:41.313 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:08:41.313 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.313 Test: blockdev write read max offset ...passed 00:08:41.313 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.313 Test: blockdev writev readv 8 blocks ...passed 00:08:41.313 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.313 Test: blockdev writev readv block ...passed 00:08:41.313 Test: blockdev writev readv size > 128k ...passed 00:08:41.313 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.313 Test: blockdev comparev and writev ...passed 00:08:41.313 Test: blockdev nvme passthru rw ...passed 00:08:41.313 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.313 Test: blockdev nvme admin passthru ...passed 00:08:41.313 Test: blockdev copy ...passed 00:08:41.313 Suite: bdevio tests on: Malloc1p0 00:08:41.313 Test: blockdev write read block ...passed 00:08:41.313 Test: blockdev write zeroes read block ...passed 00:08:41.313 Test: blockdev write zeroes read no split ...passed 00:08:41.313 Test: blockdev write zeroes read split ...passed 00:08:41.313 Test: blockdev write zeroes read split partial ...passed 00:08:41.313 Test: blockdev reset ...passed 00:08:41.313 Test: blockdev write read 8 blocks ...passed 00:08:41.313 Test: blockdev write read size > 128k ...passed 00:08:41.313 Test: blockdev write read invalid size ...passed 00:08:41.313 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.313 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.313 Test: blockdev write read max offset ...passed 00:08:41.313 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.313 Test: blockdev writev readv 8 blocks ...passed 00:08:41.313 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.313 Test: blockdev writev readv block ...passed 00:08:41.313 Test: blockdev writev readv size > 
128k ...passed 00:08:41.313 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.313 Test: blockdev comparev and writev ...passed 00:08:41.313 Test: blockdev nvme passthru rw ...passed 00:08:41.313 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.313 Test: blockdev nvme admin passthru ...passed 00:08:41.313 Test: blockdev copy ...passed 00:08:41.313 Suite: bdevio tests on: Malloc0 00:08:41.313 Test: blockdev write read block ...passed 00:08:41.313 Test: blockdev write zeroes read block ...passed 00:08:41.313 Test: blockdev write zeroes read no split ...passed 00:08:41.313 Test: blockdev write zeroes read split ...passed 00:08:41.313 Test: blockdev write zeroes read split partial ...passed 00:08:41.313 Test: blockdev reset ...passed 00:08:41.313 Test: blockdev write read 8 blocks ...passed 00:08:41.313 Test: blockdev write read size > 128k ...passed 00:08:41.313 Test: blockdev write read invalid size ...passed 00:08:41.313 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:41.313 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:41.313 Test: blockdev write read max offset ...passed 00:08:41.313 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:41.313 Test: blockdev writev readv 8 blocks ...passed 00:08:41.313 Test: blockdev writev readv 30 x 1block ...passed 00:08:41.313 Test: blockdev writev readv block ...passed 00:08:41.313 Test: blockdev writev readv size > 128k ...passed 00:08:41.313 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:41.313 Test: blockdev comparev and writev ...passed 00:08:41.313 Test: blockdev nvme passthru rw ...passed 00:08:41.313 Test: blockdev nvme passthru vendor specific ...passed 00:08:41.313 Test: blockdev nvme admin passthru ...passed 00:08:41.313 Test: blockdev copy ...passed 00:08:41.313 00:08:41.313 Run Summary: Type Total Ran Passed Failed Inactive 00:08:41.313 suites 16 16 n/a 0 0 00:08:41.313 
tests 368 368 368 0 0 00:08:41.313 asserts 2224 2224 2224 0 n/a 00:08:41.313 00:08:41.313 Elapsed time = 0.507 seconds 00:08:41.313 0 00:08:41.313 18:12:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2438427 00:08:41.313 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2438427 ']' 00:08:41.313 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2438427 00:08:41.313 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:08:41.572 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:41.572 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2438427 00:08:41.572 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:41.572 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:41.572 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2438427' 00:08:41.572 killing process with pid 2438427 00:08:41.572 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2438427 00:08:41.572 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2438427 00:08:41.830 18:12:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:08:41.830 00:08:41.830 real 0m1.702s 00:08:41.830 user 0m4.266s 00:08:41.830 sys 0m0.500s 00:08:41.830 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:41.830 18:12:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:08:41.830 ************************************ 00:08:41.830 END TEST bdev_bounds 00:08:41.830 ************************************ 00:08:41.830 18:12:25 blockdev_general -- common/autotest_common.sh@1142 -- # return 
0 00:08:41.830 18:12:25 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:41.830 18:12:25 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:08:41.830 18:12:25 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:41.830 18:12:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:41.830 ************************************ 00:08:41.830 START TEST bdev_nbd 00:08:41.830 ************************************ 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:08:41.830 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2438636 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2438636 /var/tmp/spdk-nbd.sock 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2438636 ']' 00:08:41.831 18:12:25 
blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:41.831 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:41.831 18:12:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:08:41.831 [2024-07-12 18:12:25.527498] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:08:41.831 [2024-07-12 18:12:25.527559] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:42.089 [2024-07-12 18:12:25.658098] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.089 [2024-07-12 18:12:25.760415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.348 [2024-07-12 18:12:25.919504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:42.348 [2024-07-12 18:12:25.919564] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:42.348 [2024-07-12 18:12:25.919579] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:42.348 [2024-07-12 18:12:25.927515] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:42.348 [2024-07-12 18:12:25.927550] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:42.348 [2024-07-12 18:12:25.935524] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:42.348 [2024-07-12 18:12:25.935550] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:42.348 [2024-07-12 18:12:26.012656] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:42.348 [2024-07-12 18:12:26.012707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:42.348 [2024-07-12 18:12:26.012724] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f26a40 00:08:42.348 [2024-07-12 18:12:26.012737] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:42.348 [2024-07-12 18:12:26.014149] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:42.348 [2024-07-12 18:12:26.014178] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:42.915 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.174 1+0 records in 00:08:43.174 1+0 records out 00:08:43.174 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212477 s, 19.3 MB/s 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:43.174 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:08:43.433 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:08:43.433 18:12:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.433 1+0 records in 00:08:43.433 1+0 records out 00:08:43.433 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271778 s, 15.1 MB/s 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:43.433 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:43.434 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.693 1+0 records in 00:08:43.693 1+0 records out 00:08:43.693 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297821 s, 13.8 MB/s 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:43.693 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:43.952 1+0 records in 00:08:43.952 1+0 records out 00:08:43.952 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000357997 s, 11.4 MB/s 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:43.952 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.212 1+0 records in 00:08:44.212 1+0 records out 00:08:44.212 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000367784 s, 11.1 MB/s 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:44.212 18:12:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:08:44.472 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:08:44.472 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:08:44.472 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:08:44.472 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:08:44.472 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.472 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.472 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.473 1+0 records in 00:08:44.473 1+0 records out 00:08:44.473 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000354474 s, 11.6 MB/s 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:44.473 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:44.732 1+0 records in 00:08:44.732 1+0 records out 00:08:44.732 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000490852 s, 8.3 MB/s 00:08:44.732 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.991 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:44.991 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:44.991 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.991 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:44.991 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:44.991 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:44.991 18:12:28 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.251 1+0 records in 00:08:45.251 1+0 records out 00:08:45.251 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000533277 s, 7.7 MB/s 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:45.251 18:12:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:08:45.510 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:08:45.510 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:08:45.510 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:08:45.510 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:45.510 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:45.510 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:45.510 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.511 1+0 records in 00:08:45.511 1+0 records out 
00:08:45.511 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000584208 s, 7.0 MB/s 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:45.511 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:45.769 18:12:29 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:45.769 1+0 records in 00:08:45.769 1+0 records out 00:08:45.769 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000437237 s, 9.4 MB/s 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:45.769 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 
00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.027 1+0 records in 00:08:46.027 1+0 records out 00:08:46.027 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000459189 s, 8.9 MB/s 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:46.027 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.288 1+0 records in 00:08:46.288 1+0 records out 00:08:46.288 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000561811 s, 7.3 MB/s 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:46.288 
18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:46.288 18:12:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.582 1+0 records in 00:08:46.582 1+0 records out 00:08:46.582 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000737523 s, 5.6 MB/s 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.582 18:12:30 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@884 -- # size=4096 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:46.582 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:46.840 1+0 records in 00:08:46.840 1+0 records out 00:08:46.840 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000824669 s, 5.0 MB/s 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:46.840 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q 
-w nbd14 /proc/partitions 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:47.098 1+0 records in 00:08:47.098 1+0 records out 00:08:47.098 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000732002 s, 5.6 MB/s 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:47.098 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:47.099 18:12:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:47.357 1+0 records in 00:08:47.357 1+0 records out 00:08:47.357 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000776133 s, 5.3 MB/s 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:08:47.357 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:47.616 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd0", 00:08:47.616 "bdev_name": "Malloc0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd1", 00:08:47.616 "bdev_name": "Malloc1p0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd2", 00:08:47.616 "bdev_name": "Malloc1p1" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd3", 00:08:47.616 "bdev_name": "Malloc2p0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd4", 00:08:47.616 "bdev_name": "Malloc2p1" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd5", 00:08:47.616 "bdev_name": "Malloc2p2" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd6", 00:08:47.616 "bdev_name": "Malloc2p3" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd7", 00:08:47.616 "bdev_name": "Malloc2p4" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd8", 00:08:47.616 "bdev_name": "Malloc2p5" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd9", 00:08:47.616 "bdev_name": "Malloc2p6" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd10", 00:08:47.616 "bdev_name": "Malloc2p7" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd11", 00:08:47.616 "bdev_name": "TestPT" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd12", 00:08:47.616 "bdev_name": "raid0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd13", 00:08:47.616 "bdev_name": "concat0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd14", 00:08:47.616 "bdev_name": "raid1" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd15", 00:08:47.616 "bdev_name": "AIO0" 00:08:47.616 } 00:08:47.616 ]' 00:08:47.616 18:12:31 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:08:47.616 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:08:47.616 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd0", 00:08:47.616 "bdev_name": "Malloc0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd1", 00:08:47.616 "bdev_name": "Malloc1p0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd2", 00:08:47.616 "bdev_name": "Malloc1p1" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd3", 00:08:47.616 "bdev_name": "Malloc2p0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd4", 00:08:47.616 "bdev_name": "Malloc2p1" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd5", 00:08:47.616 "bdev_name": "Malloc2p2" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd6", 00:08:47.616 "bdev_name": "Malloc2p3" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd7", 00:08:47.616 "bdev_name": "Malloc2p4" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd8", 00:08:47.616 "bdev_name": "Malloc2p5" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd9", 00:08:47.616 "bdev_name": "Malloc2p6" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd10", 00:08:47.616 "bdev_name": "Malloc2p7" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd11", 00:08:47.616 "bdev_name": "TestPT" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd12", 00:08:47.616 "bdev_name": "raid0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd13", 00:08:47.616 "bdev_name": "concat0" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd14", 00:08:47.616 "bdev_name": "raid1" 00:08:47.616 }, 00:08:47.616 { 00:08:47.616 "nbd_device": "/dev/nbd15", 00:08:47.616 "bdev_name": 
"AIO0" 00:08:47.616 } 00:08:47.616 ]' 00:08:47.616 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:08:47.616 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:47.617 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:08:47.617 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:47.617 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:08:47.617 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.617 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:47.875 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:47.875 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:47.875 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:47.875 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:47.875 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:47.875 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:47.875 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:47.875 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:47.875 18:12:31 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:47.875 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.133 18:12:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.391 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.650 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:48.908 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.166 18:12:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.424 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.681 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.938 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.196 18:12:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:08:50.454 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:08:50.454 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:08:50.454 18:12:34 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:08:50.454 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.454 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.454 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:08:50.454 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.454 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.454 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.454 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:50.713 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12
00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12
00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions
00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:50.972 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13
00:08:51.230 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13
00:08:51.230 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13
00:08:51.230 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13
00:08:51.230 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:51.230 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:51.231 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions
00:08:51.231 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:51.231 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:51.231 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:51.231 18:12:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:51.489 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:51.748 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']'
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9'
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0')
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9'
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0')
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:52.006 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:08:52.265 /dev/nbd0
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:52.265 1+0 records in
00:08:52.265 1+0 records out
00:08:52.265 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239462 s, 17.1 MB/s
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:52.265 18:12:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1
00:08:52.523 /dev/nbd1
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:52.782 1+0 records in
00:08:52.782 1+0 records out
00:08:52.782 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270182 s, 15.2 MB/s
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:52.782 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10
00:08:53.041 /dev/nbd10
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:53.041 1+0 records in
00:08:53.041 1+0 records out
00:08:53.041 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290968 s, 14.1 MB/s
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:53.041 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11
00:08:53.299 /dev/nbd11
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:53.299 1+0 records in
00:08:53.299 1+0 records out
00:08:53.299 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000351202 s, 11.7 MB/s
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:53.299 18:12:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12
00:08:53.557 /dev/nbd12
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:53.557 1+0 records in
00:08:53.557 1+0 records out
00:08:53.557 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027046 s, 15.1 MB/s
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:53.557 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13
00:08:53.814 /dev/nbd13
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:53.814 1+0 records in
00:08:53.814 1+0 records out
00:08:53.814 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000420644 s, 9.7 MB/s
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:53.814 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:53.815 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:53.815 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:53.815 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:53.815 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:53.815 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:53.815 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14
00:08:54.073 /dev/nbd14
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:54.073 1+0 records in
00:08:54.073 1+0 records out
00:08:54.073 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000431093 s, 9.5 MB/s
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:54.073 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15
00:08:54.331 /dev/nbd15
00:08:54.331 18:12:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:54.331 1+0 records in
00:08:54.331 1+0 records out
00:08:54.331 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000536151 s, 7.6 MB/s
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:54.331 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2
00:08:54.590 /dev/nbd2
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:54.590 1+0 records in
00:08:54.590 1+0 records out
00:08:54.590 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000565203 s, 7.2 MB/s
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:54.590 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:54.591 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:54.591 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:54.591 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:54.591 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3
00:08:54.849 /dev/nbd3
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:54.849 1+0 records in
00:08:54.849 1+0 records out
00:08:54.849 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000462581 s, 8.9 MB/s
00:08:54.849 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:55.108 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:55.108 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:55.108 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:55.108 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:55.108 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:55.108 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:55.108 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4
00:08:55.108 /dev/nbd4
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:55.367 1+0 records in
00:08:55.367 1+0 records out
00:08:55.367 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000549977 s, 7.4 MB/s
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:55.367 18:12:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5
00:08:55.626 /dev/nbd5
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:55.626 1+0 records in
00:08:55.626 1+0 records out
00:08:55.626 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000640244 s, 6.4 MB/s
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:55.626 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6
00:08:55.885 /dev/nbd6
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:08:55.885 1+0 records in
00:08:55.885 1+0 records out
00:08:55.885 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00076496 s, 5.4 MB/s
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 ))
00:08:55.885 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7
00:08:56.144 /dev/nbd7
00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7
00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7
00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7
00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions
00:08:56.144 18:12:39
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:56.144 1+0 records in 00:08:56.144 1+0 records out 00:08:56.144 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000561129 s, 7.3 MB/s 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:56.144 18:12:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:08:56.403 /dev/nbd8 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # 
local i 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:56.403 1+0 records in 00:08:56.403 1+0 records out 00:08:56.403 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000642763 s, 6.4 MB/s 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:56.403 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:08:56.662 /dev/nbd9 00:08:56.662 18:12:40 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:56.662 1+0 records in 00:08:56.662 1+0 records out 00:08:56.662 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000666897 s, 6.1 MB/s 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:56.662 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd0", 00:08:56.920 "bdev_name": "Malloc0" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd1", 00:08:56.920 "bdev_name": "Malloc1p0" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd10", 00:08:56.920 "bdev_name": "Malloc1p1" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd11", 00:08:56.920 "bdev_name": "Malloc2p0" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd12", 00:08:56.920 "bdev_name": "Malloc2p1" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd13", 00:08:56.920 "bdev_name": "Malloc2p2" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd14", 00:08:56.920 "bdev_name": "Malloc2p3" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd15", 00:08:56.920 "bdev_name": "Malloc2p4" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd2", 00:08:56.920 "bdev_name": "Malloc2p5" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd3", 00:08:56.920 "bdev_name": "Malloc2p6" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd4", 00:08:56.920 "bdev_name": "Malloc2p7" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd5", 00:08:56.920 "bdev_name": "TestPT" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd6", 00:08:56.920 
"bdev_name": "raid0" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd7", 00:08:56.920 "bdev_name": "concat0" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd8", 00:08:56.920 "bdev_name": "raid1" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd9", 00:08:56.920 "bdev_name": "AIO0" 00:08:56.920 } 00:08:56.920 ]' 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd0", 00:08:56.920 "bdev_name": "Malloc0" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd1", 00:08:56.920 "bdev_name": "Malloc1p0" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd10", 00:08:56.920 "bdev_name": "Malloc1p1" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd11", 00:08:56.920 "bdev_name": "Malloc2p0" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd12", 00:08:56.920 "bdev_name": "Malloc2p1" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd13", 00:08:56.920 "bdev_name": "Malloc2p2" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd14", 00:08:56.920 "bdev_name": "Malloc2p3" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd15", 00:08:56.920 "bdev_name": "Malloc2p4" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd2", 00:08:56.920 "bdev_name": "Malloc2p5" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd3", 00:08:56.920 "bdev_name": "Malloc2p6" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd4", 00:08:56.920 "bdev_name": "Malloc2p7" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd5", 00:08:56.920 "bdev_name": "TestPT" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd6", 00:08:56.920 "bdev_name": "raid0" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd7", 00:08:56.920 "bdev_name": "concat0" 00:08:56.920 }, 00:08:56.920 { 
00:08:56.920 "nbd_device": "/dev/nbd8", 00:08:56.920 "bdev_name": "raid1" 00:08:56.920 }, 00:08:56.920 { 00:08:56.920 "nbd_device": "/dev/nbd9", 00:08:56.920 "bdev_name": "AIO0" 00:08:56.920 } 00:08:56.920 ]' 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:56.920 /dev/nbd1 00:08:56.920 /dev/nbd10 00:08:56.920 /dev/nbd11 00:08:56.920 /dev/nbd12 00:08:56.920 /dev/nbd13 00:08:56.920 /dev/nbd14 00:08:56.920 /dev/nbd15 00:08:56.920 /dev/nbd2 00:08:56.920 /dev/nbd3 00:08:56.920 /dev/nbd4 00:08:56.920 /dev/nbd5 00:08:56.920 /dev/nbd6 00:08:56.920 /dev/nbd7 00:08:56.920 /dev/nbd8 00:08:56.920 /dev/nbd9' 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:56.920 /dev/nbd1 00:08:56.920 /dev/nbd10 00:08:56.920 /dev/nbd11 00:08:56.920 /dev/nbd12 00:08:56.920 /dev/nbd13 00:08:56.920 /dev/nbd14 00:08:56.920 /dev/nbd15 00:08:56.920 /dev/nbd2 00:08:56.920 /dev/nbd3 00:08:56.920 /dev/nbd4 00:08:56.920 /dev/nbd5 00:08:56.920 /dev/nbd6 00:08:56.920 /dev/nbd7 00:08:56.920 /dev/nbd8 00:08:56.920 /dev/nbd9' 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:08:56.920 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:08:56.921 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:08:56.921 18:12:40 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:08:56.921 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:56.921 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:56.921 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:08:56.921 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:56.921 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:08:57.178 256+0 records in 00:08:57.178 256+0 records out 00:08:57.178 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010087 s, 104 MB/s 00:08:57.178 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.178 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:57.178 256+0 records in 00:08:57.178 256+0 records out 00:08:57.178 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.180597 s, 5.8 MB/s 00:08:57.178 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.178 18:12:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:57.437 256+0 records in 00:08:57.437 256+0 records out 00:08:57.437 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182805 s, 5.7 MB/s 00:08:57.437 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:08:57.437 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:08:57.695 256+0 records in 00:08:57.695 256+0 records out 00:08:57.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182675 s, 5.7 MB/s 00:08:57.695 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.695 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:08:57.695 256+0 records in 00:08:57.695 256+0 records out 00:08:57.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182693 s, 5.7 MB/s 00:08:57.695 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.695 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:08:57.954 256+0 records in 00:08:57.954 256+0 records out 00:08:57.954 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182418 s, 5.7 MB/s 00:08:57.954 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:57.954 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:08:58.212 256+0 records in 00:08:58.212 256+0 records out 00:08:58.212 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182784 s, 5.7 MB/s 00:08:58.212 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:58.212 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:08:58.470 256+0 records in 00:08:58.470 256+0 
records out 00:08:58.470 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182717 s, 5.7 MB/s 00:08:58.470 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:58.470 18:12:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:08:58.470 256+0 records in 00:08:58.470 256+0 records out 00:08:58.470 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183432 s, 5.7 MB/s 00:08:58.470 18:12:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:58.470 18:12:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:08:58.729 256+0 records in 00:08:58.729 256+0 records out 00:08:58.729 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18268 s, 5.7 MB/s 00:08:58.729 18:12:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:58.729 18:12:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:08:58.987 256+0 records in 00:08:58.987 256+0 records out 00:08:58.987 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182461 s, 5.7 MB/s 00:08:58.987 18:12:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:58.987 18:12:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:08:58.987 256+0 records in 00:08:58.987 256+0 records out 00:08:58.987 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182236 s, 5.8 MB/s 00:08:58.987 18:12:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:58.987 18:12:42 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:08:59.245 256+0 records in 00:08:59.245 256+0 records out 00:08:59.245 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182709 s, 5.7 MB/s 00:08:59.245 18:12:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:59.245 18:12:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:08:59.505 256+0 records in 00:08:59.505 256+0 records out 00:08:59.505 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184608 s, 5.7 MB/s 00:08:59.505 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:59.505 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:08:59.763 256+0 records in 00:08:59.763 256+0 records out 00:08:59.763 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184149 s, 5.7 MB/s 00:08:59.763 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:59.763 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:08:59.763 256+0 records in 00:08:59.763 256+0 records out 00:08:59.763 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.186703 s, 5.6 MB/s 00:08:59.763 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:59.763 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:09:00.022 256+0 records in 00:09:00.022 256+0 records out 00:09:00.022 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.181236 s, 5.8 MB/s 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:09:00.022 
18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.022 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:00.282 18:12:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:00.542 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:00.801 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:01.059 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:01.318 18:12:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:01.577 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:01.835 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:02.093 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:02.093 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:02.093 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:02.093 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:02.094 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:02.094 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:02.094 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:02.094 18:12:45 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:02.094 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:02.094 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:02.401 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:02.401 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:02.401 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:02.401 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:02.401 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:02.401 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:02.402 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:02.402 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:02.402 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:02.402 18:12:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:02.685 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:02.943 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:03.201 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:03.459 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:03.459 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:03.459 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:03.460 18:12:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:03.460 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:03.460 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:03.460 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:03.460 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:03.460 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:03.460 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:03.718 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:03.977 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:04.236 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:04.237 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:04.237 18:12:47 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:04.237 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:04.237 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:04.237 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:04.237 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:04.237 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:04.237 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:04.237 18:12:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:04.495 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:09:04.754 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:09:04.754 
18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:09:05.012 malloc_lvol_verify 00:09:05.012 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:09:05.270 368c1fc9-d801-4202-863d-9753dc5210c9 00:09:05.270 18:12:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:09:05.528 c5ce3fde-9344-4ddf-b662-af0643be2d60 00:09:05.528 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:09:05.787 /dev/nbd0 00:09:05.787 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:09:05.787 mke2fs 1.46.5 (30-Dec-2021) 00:09:05.787 Discarding device blocks: 0/4096 done 00:09:05.787 Creating filesystem with 4096 1k blocks and 1024 inodes 00:09:05.787 00:09:05.787 Allocating group tables: 0/1 done 00:09:05.787 Writing inode tables: 0/1 done 00:09:05.787 Creating journal (1024 blocks): done 00:09:05.787 Writing superblocks and filesystem accounting information: 0/1 done 00:09:05.787 00:09:05.787 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:09:05.787 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:09:05.787 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:05.787 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:05.787 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:09:05.787 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:05.787 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:05.787 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2438636 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2438636 ']' 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2438636 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2438636 00:09:06.045 18:12:49 
blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2438636' 00:09:06.045 killing process with pid 2438636 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2438636 00:09:06.045 18:12:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2438636 00:09:06.610 18:12:50 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:09:06.610 00:09:06.610 real 0m24.837s 00:09:06.610 user 0m30.394s 00:09:06.610 sys 0m14.048s 00:09:06.610 18:12:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:06.610 18:12:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:06.610 ************************************ 00:09:06.610 END TEST bdev_nbd 00:09:06.610 ************************************ 00:09:06.869 18:12:50 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:06.869 18:12:50 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:09:06.869 18:12:50 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:09:06.869 18:12:50 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:09:06.869 18:12:50 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:09:06.869 18:12:50 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:06.869 18:12:50 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:06.869 18:12:50 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:06.869 ************************************ 00:09:06.869 START TEST bdev_fio 00:09:06.869 ************************************ 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1123 -- # fio_test_suite '' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:06.869 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:06.869 18:12:50 
blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo 
'[job_Malloc2p0]' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:09:06.869 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:09:06.870 18:12:50 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:06.870 18:12:50 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:06.870 ************************************ 00:09:06.870 START TEST bdev_fio_rw_verify 00:09:06.870 ************************************ 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # 
local fio_dir=/usr/src/fio 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:06.870 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:06.870 18:12:50 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:07.128 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:07.128 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:07.128 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:07.128 18:12:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:07.386 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:07.386 fio-3.35 00:09:07.386 Starting 16 threads 00:09:19.586 00:09:19.586 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=2442827: Fri Jul 12 18:13:01 2024 00:09:19.586 read: IOPS=86.2k, BW=337MiB/s (353MB/s)(3366MiB/10001msec) 00:09:19.586 slat (usec): min=3, max=1391, avg=37.12, stdev=14.08 00:09:19.586 clat (usec): min=9, max=1830, avg=301.75, stdev=131.25 00:09:19.586 lat (usec): min=18, max=1861, avg=338.87, stdev=138.47 00:09:19.586 clat percentiles (usec): 00:09:19.586 | 50.000th=[ 289], 99.000th=[ 594], 99.900th=[ 660], 99.990th=[ 938], 00:09:19.586 | 99.999th=[ 1123] 00:09:19.586 write: IOPS=137k, BW=534MiB/s (560MB/s)(5272MiB/9870msec); 0 zone resets 00:09:19.586 slat (usec): min=6, max=3499, avg=50.52, stdev=14.97 00:09:19.586 clat (usec): min=13, max=3874, avg=354.24, stdev=156.54 00:09:19.586 lat (usec): min=32, max=3916, avg=404.76, stdev=163.66 00:09:19.586 clat percentiles (usec): 
00:09:19.586 | 50.000th=[ 343], 99.000th=[ 750], 99.900th=[ 988], 99.990th=[ 1057], 00:09:19.586 | 99.999th=[ 1172] 00:09:19.586 bw ( KiB/s): min=435472, max=798063, per=99.31%, avg=543181.47, stdev=8176.24, samples=304 00:09:19.586 iops : min=108868, max=199511, avg=135794.95, stdev=2044.01, samples=304 00:09:19.586 lat (usec) : 10=0.01%, 20=0.01%, 50=0.49%, 100=3.00%, 250=29.27% 00:09:19.586 lat (usec) : 500=52.55%, 750=14.05%, 1000=0.58% 00:09:19.586 lat (msec) : 2=0.04%, 4=0.01% 00:09:19.586 cpu : usr=99.16%, sys=0.44%, ctx=680, majf=0, minf=2640 00:09:19.586 IO depths : 1=12.5%, 2=24.9%, 4=50.1%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:19.586 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:19.586 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:19.586 issued rwts: total=861626,1349668,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:19.586 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:19.586 00:09:19.586 Run status group 0 (all jobs): 00:09:19.586 READ: bw=337MiB/s (353MB/s), 337MiB/s-337MiB/s (353MB/s-353MB/s), io=3366MiB (3529MB), run=10001-10001msec 00:09:19.586 WRITE: bw=534MiB/s (560MB/s), 534MiB/s-534MiB/s (560MB/s-560MB/s), io=5272MiB (5528MB), run=9870-9870msec 00:09:19.586 00:09:19.586 real 0m11.723s 00:09:19.586 user 2m45.210s 00:09:19.586 sys 0m1.342s 00:09:19.586 18:13:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:19.586 18:13:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:19.586 ************************************ 00:09:19.586 END TEST bdev_fio_rw_verify 00:09:19.586 ************************************ 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:19.586 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:19.587 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # 
printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "8c026948-175e-43a4-9c68-661d1aa15a51"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8c026948-175e-43a4-9c68-661d1aa15a51",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "10a8a23b-75e7-5a95-a595-934c63455d74"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "10a8a23b-75e7-5a95-a595-934c63455d74",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' 
' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "92b07044-9741-5918-86bb-7dc7a7ff5094"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "92b07044-9741-5918-86bb-7dc7a7ff5094",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "b605ca3f-a8ce-57fb-8a95-9efc3a7c9873"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b605ca3f-a8ce-57fb-8a95-9efc3a7c9873",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' 
}' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "874975c9-499f-5584-8e76-a3baa87ae239"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "874975c9-499f-5584-8e76-a3baa87ae239",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "24a31cf2-8c7b-5a42-894f-9035e0295699"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "24a31cf2-8c7b-5a42-894f-9035e0295699",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' 
"aliases": [' ' "d97aed29-4bfe-5d15-bbda-b935615639e9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d97aed29-4bfe-5d15-bbda-b935615639e9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "91c509dc-cd44-5a4b-992e-0450ebb2e36e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "91c509dc-cd44-5a4b-992e-0450ebb2e36e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' 
"52c6ac66-4aa9-5435-a650-d22d2f799ff0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "52c6ac66-4aa9-5435-a650-d22d2f799ff0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "ff93a3ae-9750-5893-b29c-979716dfe45a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ff93a3ae-9750-5893-b29c-979716dfe45a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "e2c11e67-14f3-5dfe-bc1b-ad64b5e428eb"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e2c11e67-14f3-5dfe-bc1b-ad64b5e428eb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "7b697129-79b5-5be7-9104-aa2d250452a5"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7b697129-79b5-5be7-9104-aa2d250452a5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' 
"base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "b18e8578-2643-4520-9d9d-c641d0cdb6cb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "b18e8578-2643-4520-9d9d-c641d0cdb6cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "b18e8578-2643-4520-9d9d-c641d0cdb6cb",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "a2091772-e110-46aa-a46b-7c07530e6ca9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "82d0c7ef-46f1-44df-b775-eed1988d029c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "0df88812-5798-4dba-b9fb-c9f9390f0234"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 
512,' ' "num_blocks": 131072,' ' "uuid": "0df88812-5798-4dba-b9fb-c9f9390f0234",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0df88812-5798-4dba-b9fb-c9f9390f0234",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "bdbffdf3-f179-4faf-8ab4-95de8aee6143",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "81c8b237-3342-48dc-99d7-48c6e1c22d95",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "b721d4ff-426f-4dcf-9f63-13f4c0f8fcfd"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b721d4ff-426f-4dcf-9f63-13f4c0f8fcfd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "b721d4ff-426f-4dcf-9f63-13f4c0f8fcfd",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "e4bce0c5-9d51-489c-8914-8554a294b2f6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "423fd104-aae0-4492-8a7c-3d25ec5fe345",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "97c75e37-b558-4fcf-a0d6-c74abc978df6"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "97c75e37-b558-4fcf-a0d6-c74abc978df6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": 
true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:19.587 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:19.587 Malloc1p0 00:09:19.587 Malloc1p1 00:09:19.587 Malloc2p0 00:09:19.587 Malloc2p1 00:09:19.587 Malloc2p2 00:09:19.587 Malloc2p3 00:09:19.587 Malloc2p4 00:09:19.587 Malloc2p5 00:09:19.587 Malloc2p6 00:09:19.587 Malloc2p7 00:09:19.587 TestPT 00:09:19.587 raid0 00:09:19.587 concat0 ]] 00:09:19.587 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "8c026948-175e-43a4-9c68-661d1aa15a51"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8c026948-175e-43a4-9c68-661d1aa15a51",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "10a8a23b-75e7-5a95-a595-934c63455d74"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "10a8a23b-75e7-5a95-a595-934c63455d74",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "92b07044-9741-5918-86bb-7dc7a7ff5094"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "92b07044-9741-5918-86bb-7dc7a7ff5094",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "b605ca3f-a8ce-57fb-8a95-9efc3a7c9873"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b605ca3f-a8ce-57fb-8a95-9efc3a7c9873",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "874975c9-499f-5584-8e76-a3baa87ae239"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "874975c9-499f-5584-8e76-a3baa87ae239",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "24a31cf2-8c7b-5a42-894f-9035e0295699"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "24a31cf2-8c7b-5a42-894f-9035e0295699",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d97aed29-4bfe-5d15-bbda-b935615639e9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d97aed29-4bfe-5d15-bbda-b935615639e9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "91c509dc-cd44-5a4b-992e-0450ebb2e36e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "91c509dc-cd44-5a4b-992e-0450ebb2e36e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "52c6ac66-4aa9-5435-a650-d22d2f799ff0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "52c6ac66-4aa9-5435-a650-d22d2f799ff0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "ff93a3ae-9750-5893-b29c-979716dfe45a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ff93a3ae-9750-5893-b29c-979716dfe45a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "e2c11e67-14f3-5dfe-bc1b-ad64b5e428eb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e2c11e67-14f3-5dfe-bc1b-ad64b5e428eb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "7b697129-79b5-5be7-9104-aa2d250452a5"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7b697129-79b5-5be7-9104-aa2d250452a5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "b18e8578-2643-4520-9d9d-c641d0cdb6cb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "b18e8578-2643-4520-9d9d-c641d0cdb6cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "b18e8578-2643-4520-9d9d-c641d0cdb6cb",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "a2091772-e110-46aa-a46b-7c07530e6ca9",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "82d0c7ef-46f1-44df-b775-eed1988d029c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "0df88812-5798-4dba-b9fb-c9f9390f0234"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "0df88812-5798-4dba-b9fb-c9f9390f0234",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' 
' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "0df88812-5798-4dba-b9fb-c9f9390f0234",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "bdbffdf3-f179-4faf-8ab4-95de8aee6143",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "81c8b237-3342-48dc-99d7-48c6e1c22d95",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "b721d4ff-426f-4dcf-9f63-13f4c0f8fcfd"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b721d4ff-426f-4dcf-9f63-13f4c0f8fcfd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "b721d4ff-426f-4dcf-9f63-13f4c0f8fcfd",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "e4bce0c5-9d51-489c-8914-8554a294b2f6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "423fd104-aae0-4492-8a7c-3d25ec5fe345",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "97c75e37-b558-4fcf-a0d6-c74abc978df6"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "97c75e37-b558-4fcf-a0d6-c74abc978df6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.588 18:13:02 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:19.588 ************************************ 00:09:19.588 START TEST bdev_fio_trim 00:09:19.588 ************************************ 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:19.588 18:13:02 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:19.588 18:13:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:19.588 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc1p0: (g=0): 
rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:19.588 fio-3.35 00:09:19.588 Starting 14 threads 00:09:31.801 00:09:31.801 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=2444535: Fri Jul 12 18:13:13 2024 00:09:31.801 write: IOPS=126k, BW=492MiB/s (516MB/s)(4925MiB/10001msec); 0 zone resets 00:09:31.801 slat (nsec): min=1913, max=637516, avg=38832.75, 
stdev=12939.02 00:09:31.801 clat (usec): min=16, max=3418, avg=279.81, stdev=106.86 00:09:31.801 lat (usec): min=33, max=3443, avg=318.64, stdev=113.57 00:09:31.801 clat percentiles (usec): 00:09:31.801 | 50.000th=[ 265], 99.000th=[ 537], 99.900th=[ 586], 99.990th=[ 619], 00:09:31.801 | 99.999th=[ 955] 00:09:31.801 bw ( KiB/s): min=425920, max=754048, per=100.00%, avg=507151.47, stdev=6934.96, samples=266 00:09:31.801 iops : min=106480, max=188512, avg=126787.79, stdev=1733.74, samples=266 00:09:31.801 trim: IOPS=126k, BW=493MiB/s (516MB/s)(4925MiB/10001msec); 0 zone resets 00:09:31.801 slat (usec): min=4, max=3153, avg=26.89, stdev= 8.83 00:09:31.801 clat (usec): min=3, max=3443, avg=312.48, stdev=119.57 00:09:31.801 lat (usec): min=14, max=3457, avg=339.37, stdev=124.53 00:09:31.801 clat percentiles (usec): 00:09:31.801 | 50.000th=[ 302], 99.000th=[ 594], 99.900th=[ 644], 99.990th=[ 676], 00:09:31.801 | 99.999th=[ 783] 00:09:31.801 bw ( KiB/s): min=425920, max=753984, per=100.00%, avg=507151.89, stdev=6934.95, samples=266 00:09:31.801 iops : min=106480, max=188496, avg=126787.89, stdev=1733.72, samples=266 00:09:31.801 lat (usec) : 4=0.01%, 10=0.01%, 20=0.05%, 50=0.20%, 100=1.40% 00:09:31.801 lat (usec) : 250=37.40%, 500=55.61%, 750=5.32%, 1000=0.01% 00:09:31.801 lat (msec) : 2=0.01%, 4=0.01% 00:09:31.801 cpu : usr=99.58%, sys=0.00%, ctx=589, majf=0, minf=949 00:09:31.801 IO depths : 1=12.4%, 2=24.9%, 4=50.0%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:31.801 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:31.801 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:31.801 issued rwts: total=0,1260923,1260927,0 short=0,0,0,0 dropped=0,0,0,0 00:09:31.801 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:31.801 00:09:31.801 Run status group 0 (all jobs): 00:09:31.801 WRITE: bw=492MiB/s (516MB/s), 492MiB/s-492MiB/s (516MB/s-516MB/s), io=4925MiB (5165MB), run=10001-10001msec 00:09:31.801 TRIM: 
bw=493MiB/s (516MB/s), 493MiB/s-493MiB/s (516MB/s-516MB/s), io=4925MiB (5165MB), run=10001-10001msec 00:09:31.801 00:09:31.801 real 0m11.628s 00:09:31.801 user 2m25.561s 00:09:31.801 sys 0m0.750s 00:09:31.801 18:13:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:31.801 18:13:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:31.801 ************************************ 00:09:31.801 END TEST bdev_fio_trim 00:09:31.801 ************************************ 00:09:31.801 18:13:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:31.801 18:13:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:31.801 18:13:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:31.801 18:13:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:31.801 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:31.801 18:13:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:31.801 00:09:31.801 real 0m23.731s 00:09:31.801 user 5m10.975s 00:09:31.801 sys 0m2.299s 00:09:31.801 18:13:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:31.801 18:13:14 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:31.801 ************************************ 00:09:31.801 END TEST bdev_fio 00:09:31.801 ************************************ 00:09:31.801 18:13:14 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:31.801 18:13:14 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:31.801 18:13:14 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify 
-t 5 -C -m 0x3 '' 00:09:31.801 18:13:14 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:31.801 18:13:14 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:31.801 18:13:14 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:31.801 ************************************ 00:09:31.801 START TEST bdev_verify 00:09:31.801 ************************************ 00:09:31.801 18:13:14 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:31.801 [2024-07-12 18:13:14.254098] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:09:31.801 [2024-07-12 18:13:14.254165] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2445978 ] 00:09:31.801 [2024-07-12 18:13:14.381973] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:31.801 [2024-07-12 18:13:14.483475] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:31.801 [2024-07-12 18:13:14.483480] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:31.801 [2024-07-12 18:13:14.636739] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:31.802 [2024-07-12 18:13:14.636800] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:31.802 [2024-07-12 18:13:14.636816] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:31.802 [2024-07-12 18:13:14.644742] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:31.802 [2024-07-12 
18:13:14.644768] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:31.802 [2024-07-12 18:13:14.652755] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:31.802 [2024-07-12 18:13:14.652786] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:31.802 [2024-07-12 18:13:14.729922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:31.802 [2024-07-12 18:13:14.729983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:31.802 [2024-07-12 18:13:14.730002] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21234d0 00:09:31.802 [2024-07-12 18:13:14.730014] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:31.802 [2024-07-12 18:13:14.731695] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:31.802 [2024-07-12 18:13:14.731724] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:31.802 Running I/O for 5 seconds... 
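The fio job file being exercised here was generated earlier in the log by the `bdev/blockdev.sh@356-358` loop, which pipes the `bdev_get_bdevs` JSON through `jq -r 'select(.supported_io_types.unmap == true) | .name'` and emits a `[job_<name>]` section per trim-capable bdev. A minimal sketch of that selection logic, using a small hypothetical bdev list (not taken from this run) in place of the real RPC output:

```python
# Hypothetical sketch: mirror the jq filter used by bdev/blockdev.sh@356-358,
# which keeps only bdevs whose supported_io_types.unmap is true and emits a
# fio [job_*] section for each. The bdev list below is illustrative only.
import json

bdevs_json = """
[
  {"name": "Malloc0", "supported_io_types": {"unmap": true}},
  {"name": "AIO0",    "supported_io_types": {"unmap": false}}
]
"""

def fio_job_sections(raw):
    """Emit '[job_<name>]' / 'filename=<name>' line pairs, equivalent to
    jq -r 'select(.supported_io_types.unmap == true) | .name' plus the
    echo statements in the loop body."""
    lines = []
    for bdev in json.loads(raw):
        # AIO0 is skipped here, matching its "unmap": false in the dump above.
        if bdev.get("supported_io_types", {}).get("unmap"):
            lines.append(f"[job_{bdev['name']}]")
            lines.append(f"filename={bdev['name']}")
    return "\n".join(lines)

print(fio_job_sections(bdevs_json))
```

This is why bdevs such as AIO0 and raid1 (both reporting `"unmap": false` in the dump above) get no `[job_*]` section and are absent from the fio thread list.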
00:09:37.067
00:09:37.067 Latency(us)
00:09:37.067 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:37.067 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.067 Verification LBA range: start 0x0 length 0x1000
00:09:37.067 Malloc0 : 5.21 1032.60 4.03 0.00 0.00 123693.83 641.11 426724.84
00:09:37.067 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.067 Verification LBA range: start 0x1000 length 0x1000
00:09:37.068 Malloc0 : 5.20 1008.55 3.94 0.00 0.00 126651.06 633.99 474138.71
00:09:37.068 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x800
00:09:37.068 Malloc1p0 : 5.21 540.43 2.11 0.00 0.00 235449.57 3618.73 237069.36
00:09:37.068 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x800 length 0x800
00:09:37.068 Malloc1p0 : 5.21 540.91 2.11 0.00 0.00 235260.94 3575.99 237069.36
00:09:37.068 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x800
00:09:37.068 Malloc1p1 : 5.22 539.97 2.11 0.00 0.00 234948.89 3575.99 235245.75
00:09:37.068 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x800 length 0x800
00:09:37.068 Malloc1p1 : 5.21 540.45 2.11 0.00 0.00 234743.63 3561.74 235245.75
00:09:37.068 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x200
00:09:37.068 Malloc2p0 : 5.22 539.53 2.11 0.00 0.00 234407.42 3476.26 227951.30
00:09:37.068 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x200 length 0x200
00:09:37.068 Malloc2p0 : 5.21 540.00 2.11 0.00 0.00 234199.27 3476.26 227951.30
00:09:37.068 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x200
00:09:37.068 Malloc2p1 : 5.22 539.09 2.11 0.00 0.00 233815.43 3419.27 224304.08
00:09:37.068 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x200 length 0x200
00:09:37.068 Malloc2p1 : 5.22 539.56 2.11 0.00 0.00 233611.89 3433.52 224304.08
00:09:37.068 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x200
00:09:37.068 Malloc2p2 : 5.23 538.64 2.10 0.00 0.00 233309.40 3462.01 218833.25
00:09:37.068 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x200 length 0x200
00:09:37.068 Malloc2p2 : 5.22 539.12 2.11 0.00 0.00 233092.86 3490.50 218833.25
00:09:37.068 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x200
00:09:37.068 Malloc2p3 : 5.23 538.21 2.10 0.00 0.00 232802.32 3675.71 213362.42
00:09:37.068 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x200 length 0x200
00:09:37.068 Malloc2p3 : 5.23 538.67 2.10 0.00 0.00 232592.71 3675.71 213362.42
00:09:37.068 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x200
00:09:37.068 Malloc2p4 : 5.24 537.78 2.10 0.00 0.00 232307.13 3561.74 206979.78
00:09:37.068 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x200 length 0x200
00:09:37.068 Malloc2p4 : 5.23 538.24 2.10 0.00 0.00 232089.43 3575.99 207891.59
00:09:37.068 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x200
00:09:37.068 Malloc2p5 : 5.24 537.35 2.10 0.00 0.00 231760.44 3405.02 205156.17
00:09:37.068 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x200 length 0x200
00:09:37.068 Malloc2p5 : 5.24 537.81 2.10 0.00 0.00 231563.39 3419.27 205156.17
00:09:37.068 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x200
00:09:37.068 Malloc2p6 : 5.24 536.93 2.10 0.00 0.00 231266.49 3447.76 203332.56
00:09:37.068 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x200 length 0x200
00:09:37.068 Malloc2p6 : 5.24 537.39 2.10 0.00 0.00 231064.78 3433.52 203332.56
00:09:37.068 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x200
00:09:37.068 Malloc2p7 : 5.25 536.52 2.10 0.00 0.00 230723.49 3618.73 201508.95
00:09:37.068 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x200 length 0x200
00:09:37.068 Malloc2p7 : 5.24 536.96 2.10 0.00 0.00 230511.42 3632.97 201508.95
00:09:37.068 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x1000
00:09:37.068 TestPT : 5.29 531.93 2.08 0.00 0.00 231794.36 18919.96 202420.76
00:09:37.068 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x1000 length 0x1000
00:09:37.068 TestPT : 5.29 510.64 1.99 0.00 0.00 239872.31 18919.96 275365.18
00:09:37.068 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x2000
00:09:37.068 raid0 : 5.31 554.60 2.17 0.00 0.00 221657.79 3618.73 182361.04
00:09:37.068 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x2000 length 0x2000
00:09:37.068 raid0 : 5.30 555.14 2.17 0.00 0.00 221465.00 3632.97 175978.41
00:09:37.068 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x2000
00:09:37.068 concat0 : 5.31 554.22 2.16 0.00 0.00 221175.58 3675.71 177802.02
00:09:37.068 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x2000 length 0x2000
00:09:37.068 concat0 : 5.31 554.90 2.17 0.00 0.00 220870.72 3704.21 173242.99
00:09:37.068 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x1000
00:09:37.068 raid1 : 5.31 553.93 2.16 0.00 0.00 220622.36 4359.57 176890.21
00:09:37.068 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x1000 length 0x1000
00:09:37.068 raid1 : 5.31 554.63 2.17 0.00 0.00 220317.02 4388.06 178713.82
00:09:37.068 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x0 length 0x4e2
00:09:37.068 AIO0 : 5.32 553.77 2.16 0.00 0.00 219990.33 1880.60 186920.07
00:09:37.068 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:09:37.068 Verification LBA range: start 0x4e2 length 0x4e2
00:09:37.068 AIO0 : 5.31 554.30 2.17 0.00 0.00 219768.30 1866.35 187831.87
00:09:37.068 ===================================================================================================================
00:09:37.068 Total : 18292.74 71.46 0.00 0.00 218201.64 633.99 474138.71
00:09:37.068
00:09:37.068 real 0m6.592s
00:09:37.068 user 0m12.185s
00:09:37.068 sys 0m0.434s
00:09:37.068 18:13:20 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:37.068 18:13:20 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:09:37.068 ************************************
00:09:37.068 END TEST bdev_verify
00:09:37.068 ************************************
00:09:37.327 18:13:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:37.327 18:13:20 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:09:37.327 18:13:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:09:37.327 18:13:20 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:37.327 18:13:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:37.327 ************************************
00:09:37.327 START TEST bdev_verify_big_io
00:09:37.327 ************************************
00:09:37.327 18:13:20 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:09:37.327 [2024-07-12 18:13:20.931974] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:09:37.327 [2024-07-12 18:13:20.932035] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2446875 ]
00:09:37.585 [2024-07-12 18:13:21.058042] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2
00:09:37.585 [2024-07-12 18:13:21.160328] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:09:37.585 [2024-07-12 18:13:21.160334] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:37.844 [2024-07-12 18:13:21.316422] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:37.844 [2024-07-12 18:13:21.316480] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:37.844 [2024-07-12 18:13:21.316494] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:37.844 [2024-07-12 18:13:21.324428] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:37.844 [2024-07-12 18:13:21.324454] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:37.844 [2024-07-12 18:13:21.332446] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:37.844 [2024-07-12 18:13:21.332469] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:37.844 [2024-07-12 18:13:21.409615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:37.844 [2024-07-12 18:13:21.409666] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:09:37.844 [2024-07-12 18:13:21.409684] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22cf4d0
00:09:37.844 [2024-07-12 18:13:21.409697] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:09:37.844 [2024-07-12 18:13:21.411321] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:09:37.844 [2024-07-12 18:13:21.411349] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:09:37.844 [2024-07-12 18:13:21.570718] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:09:38.102 [2024-07-12 18:13:21.571746] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:09:38.102 [2024-07-12 18:13:21.573280] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:09:38.102 [2024-07-12 18:13:21.574285] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:09:38.102 [2024-07-12 18:13:21.575798] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.576800] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.578316] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.579828] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.580797] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.582304] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.583268] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.584717] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.585596] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.587001] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.587874] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.589287] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:09:38.103 [2024-07-12 18:13:21.613297] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:09:38.103 [2024-07-12 18:13:21.615294] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:09:38.103 Running I/O for 5 seconds...
00:09:46.265
00:09:46.265 Latency(us)
00:09:46.265 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:46.265 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x100
00:09:46.265 Malloc0 : 5.97 171.55 10.72 0.00 0.00 732271.16 861.94 2173743.64
00:09:46.265 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x100 length 0x100
00:09:46.265 Malloc0 : 6.86 223.88 13.99 0.00 0.00 420805.58 858.38 620027.55
00:09:46.265 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x80
00:09:46.265 Malloc1p0 : 6.76 37.88 2.37 0.00 0.00 3041855.01 1488.81 5251998.05
00:09:46.265 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x80 length 0x80
00:09:46.265 Malloc1p0 : 6.61 82.92 5.18 0.00 0.00 1443478.37 2464.72 2903187.81
00:09:46.265 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x80
00:09:46.265 Malloc1p1 : 6.76 37.88 2.37 0.00 0.00 2946901.63 1524.42 5106109.22
00:09:46.265 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x80 length 0x80
00:09:46.265 Malloc1p1 : 6.90 34.79 2.17 0.00 0.00 3399330.34 1510.18 5777197.86
00:09:46.265 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x20
00:09:46.265 Malloc2p0 : 6.17 25.92 1.62 0.00 0.00 1093794.11 641.11 1830904.88
00:09:46.265 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x20 length 0x20
00:09:46.265 Malloc2p0 : 6.38 22.56 1.41 0.00 0.00 1296275.35 644.67 2173743.64
00:09:46.265 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x20
00:09:46.265 Malloc2p1 : 6.17 25.91 1.62 0.00 0.00 1084046.04 662.48 1809021.55
00:09:46.265 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x20 length 0x20
00:09:46.265 Malloc2p1 : 6.38 22.56 1.41 0.00 0.00 1284885.83 648.24 2144565.87
00:09:46.265 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x20
00:09:46.265 Malloc2p2 : 6.18 25.91 1.62 0.00 0.00 1074708.54 641.11 1779843.78
00:09:46.265 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x20 length 0x20
00:09:46.265 Malloc2p2 : 6.38 22.55 1.41 0.00 0.00 1274180.50 673.17 2115388.10
00:09:46.265 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x20
00:09:46.265 Malloc2p3 : 6.18 25.90 1.62 0.00 0.00 1065789.43 644.67 1757960.46
00:09:46.265 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x20 length 0x20
00:09:46.265 Malloc2p3 : 6.39 22.55 1.41 0.00 0.00 1262855.66 658.92 2086210.34
00:09:46.265 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x20
00:09:46.265 Malloc2p4 : 6.28 28.04 1.75 0.00 0.00 987419.69 641.11 1728782.69
00:09:46.265 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x20 length 0x20
00:09:46.265 Malloc2p4 : 6.39 22.54 1.41 0.00 0.00 1252716.97 658.92 2071621.45
00:09:46.265 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x20
00:09:46.265 Malloc2p5 : 6.28 28.04 1.75 0.00 0.00 978592.94 633.99 1699604.93
00:09:46.265 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x20 length 0x20
00:09:46.265 Malloc2p5 : 6.39 22.54 1.41 0.00 0.00 1242478.38 648.24 2042443.69
00:09:46.265 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x20
00:09:46.265 Malloc2p6 : 6.28 28.03 1.75 0.00 0.00 970088.74 630.43 1677721.60
00:09:46.265 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x20 length 0x20
00:09:46.265 Malloc2p6 : 6.39 22.53 1.41 0.00 0.00 1232133.43 666.05 2013265.92
00:09:46.265 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x20
00:09:46.265 Malloc2p7 : 6.28 28.03 1.75 0.00 0.00 960906.31 641.11 1655838.27
00:09:46.265 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x20 length 0x20
00:09:46.265 Malloc2p7 : 6.39 22.53 1.41 0.00 0.00 1221038.12 651.80 1998677.04
00:09:46.265 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x100
00:09:46.265 TestPT : 6.80 39.98 2.50 0.00 0.00 2554305.55 1481.68 4726798.25
00:09:46.265 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x100 length 0x100
00:09:46.265 TestPT : 6.90 32.76 2.05 0.00 0.00 3275317.31 191479.10 4201598.44
00:09:46.265 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x200
00:09:46.265 raid0 : 6.60 43.61 2.73 0.00 0.00 2298458.07 1581.41 4551731.65
00:09:46.265 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x200 length 0x200
00:09:46.265 raid0 : 6.90 37.09 2.32 0.00 0.00 2840494.73 1609.91 4989398.15
00:09:46.265 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x200
00:09:46.265 concat0 : 6.76 47.33 2.96 0.00 0.00 2048635.89 1581.41 4405842.81
00:09:46.265 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x200 length 0x200
00:09:46.265 concat0 : 6.87 39.59 2.47 0.00 0.00 2600041.66 1617.03 4814331.55
00:09:46.265 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x100
00:09:46.265 raid1 : 6.81 65.83 4.11 0.00 0.00 1470998.23 2023.07 4230776.21
00:09:46.265 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x100 length 0x100
00:09:46.265 raid1 : 6.87 39.58 2.47 0.00 0.00 2506779.66 2065.81 4639264.95
00:09:46.265 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x0 length 0x4e
00:09:46.265 AIO0 : 6.81 60.52 3.78 0.00 0.00 948714.05 502.21 2509287.96
00:09:46.265 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536)
00:09:46.265 Verification LBA range: start 0x4e length 0x4e
00:09:46.265 AIO0 : 6.86 34.71 2.17 0.00 0.00 1699345.51 837.01 3063665.53
00:09:46.265 ===================================================================================================================
00:09:46.265 Total : 1426.05 89.13 0.00 0.00 1462140.89 502.21 5777197.86
00:09:46.265
00:09:46.266 real 0m8.153s
00:09:46.266 user 0m15.323s
00:09:46.266 sys 0m0.429s
00:09:46.266 18:13:29 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:46.266 18:13:29 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:09:46.266 ************************************
00:09:46.266 END TEST bdev_verify_big_io
00:09:46.266 ************************************
00:09:46.266 18:13:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:46.266 18:13:29 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:46.266 18:13:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:46.266 18:13:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:46.266 18:13:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:46.266 ************************************
00:09:46.266 START TEST bdev_write_zeroes
00:09:46.266 ************************************
00:09:46.266 18:13:29 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:46.266 [2024-07-12 18:13:29.131889] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:09:46.266 [2024-07-12 18:13:29.131935] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2447948 ]
00:09:46.266 [2024-07-12 18:13:29.242171] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:46.266 [2024-07-12 18:13:29.342326] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:46.266 [2024-07-12 18:13:29.505365] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:46.266 [2024-07-12 18:13:29.505425] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:46.266 [2024-07-12 18:13:29.505439] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:46.266 [2024-07-12 18:13:29.513372] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:46.266 [2024-07-12 18:13:29.513398] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:46.266 [2024-07-12 18:13:29.521381] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:46.266 [2024-07-12 18:13:29.521405] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:46.266 [2024-07-12 18:13:29.598230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:46.266 [2024-07-12 18:13:29.598284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:09:46.266 [2024-07-12 18:13:29.598302] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfabc10
00:09:46.266 [2024-07-12 18:13:29.598314] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:09:46.266 [2024-07-12 18:13:29.599831] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:09:46.266 [2024-07-12 18:13:29.599860] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:09:46.266 Running I/O for 1 seconds...
00:09:47.200
00:09:47.200 Latency(us)
00:09:47.200 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:09:47.200 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc0 : 1.05 5005.07 19.55 0.00 0.00 25566.84 662.48 42854.85
00:09:47.200 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc1p0 : 1.05 4997.99 19.52 0.00 0.00 25557.40 918.93 41943.04
00:09:47.200 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc1p1 : 1.05 4990.92 19.50 0.00 0.00 25535.82 894.00 41031.23
00:09:47.200 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc2p0 : 1.05 4983.81 19.47 0.00 0.00 25511.18 908.24 40119.43
00:09:47.200 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc2p1 : 1.05 4976.81 19.44 0.00 0.00 25491.26 901.12 39435.58
00:09:47.200 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc2p2 : 1.06 4969.84 19.41 0.00 0.00 25472.63 918.93 38523.77
00:09:47.200 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc2p3 : 1.06 4962.81 19.39 0.00 0.00 25453.26 901.12 37611.97
00:09:47.200 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc2p4 : 1.06 4955.88 19.36 0.00 0.00 25432.87 897.56 36700.16
00:09:47.200 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc2p5 : 1.06 4948.95 19.33 0.00 0.00 25412.70 904.68 35788.35
00:09:47.200 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc2p6 : 1.06 4942.02 19.30 0.00 0.00 25393.25 904.68 34876.55
00:09:47.200 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 Malloc2p7 : 1.06 4935.05 19.28 0.00 0.00 25370.38 901.12 33964.74
00:09:47.200 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 TestPT : 1.06 4928.19 19.25 0.00 0.00 25349.67 933.18 33052.94
00:09:47.200 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 raid0 : 1.07 4920.26 19.22 0.00 0.00 25323.66 1609.91 31457.28
00:09:47.200 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 concat0 : 1.07 4912.54 19.19 0.00 0.00 25272.94 1602.78 29861.62
00:09:47.200 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 raid1 : 1.07 4902.84 19.15 0.00 0.00 25210.18 2550.21 27240.18
00:09:47.200 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:09:47.200 AIO0 : 1.07 4896.89 19.13 0.00 0.00 25122.88 1075.65 26784.28
00:09:47.200 ===================================================================================================================
00:09:47.200 Total : 79229.87 309.49 0.00 0.00 25404.81 662.48 42854.85
00:09:47.766
00:09:47.766 real 0m2.185s
00:09:47.766 user 0m1.811s
00:09:47.766 sys 0m0.324s
00:09:47.766 18:13:31 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:47.766 18:13:31 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:09:47.766 ************************************
00:09:47.766 END TEST bdev_write_zeroes
00:09:47.766 ************************************
00:09:47.766 18:13:31 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:47.766 18:13:31 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:47.766 18:13:31 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:47.766 18:13:31 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:47.766 18:13:31 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:47.766 ************************************
00:09:47.766 START TEST bdev_json_nonenclosed
00:09:47.766 ************************************
00:09:47.766 18:13:31 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:47.766 [2024-07-12 18:13:31.426727] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:09:47.766 [2024-07-12 18:13:31.426793] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2448252 ]
00:09:48.024 [2024-07-12 18:13:31.555288] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:48.024 [2024-07-12 18:13:31.662336] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:48.024 [2024-07-12 18:13:31.662407] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:09:48.024 [2024-07-12 18:13:31.662428] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:09:48.024 [2024-07-12 18:13:31.662440] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:09:48.283
00:09:48.283 real 0m0.407s
00:09:48.283 user 0m0.243s
00:09:48.283 sys 0m0.160s
00:09:48.283 18:13:31 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:09:48.283 18:13:31 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:48.283 18:13:31 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:09:48.283 ************************************
00:09:48.283 END TEST bdev_json_nonenclosed
00:09:48.283 ************************************
00:09:48.283 18:13:31 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:09:48.283 18:13:31 blockdev_general -- bdev/blockdev.sh@782 -- # true
00:09:48.283 18:13:31 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:48.283 18:13:31 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:48.283 18:13:31 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:48.283 18:13:31 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:48.283 ************************************
00:09:48.283 START TEST bdev_json_nonarray
00:09:48.283 ************************************
00:09:48.283 18:13:31 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:09:48.283 [2024-07-12 18:13:31.920861] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:09:48.283 [2024-07-12 18:13:31.920931] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2448335 ]
00:09:48.540 [2024-07-12 18:13:32.048846] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:48.540 [2024-07-12 18:13:32.149311] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:48.540 [2024-07-12 18:13:32.149385] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:09:48.540 [2024-07-12 18:13:32.149406] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:09:48.540 [2024-07-12 18:13:32.149418] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:09:48.540
00:09:48.540 real 0m0.394s
00:09:48.540 user 0m0.242s
00:09:48.540 sys 0m0.150s
00:09:48.540 18:13:32 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:09:48.540 18:13:32 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:48.540 18:13:32 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:09:48.540 ************************************
00:09:48.540 END TEST bdev_json_nonarray
00:09:48.540 ************************************
00:09:48.798 18:13:32 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:09:48.798 18:13:32 blockdev_general -- bdev/blockdev.sh@785 -- # true
00:09:48.798 18:13:32 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]]
00:09:48.798 18:13:32 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite ''
00:09:48.798 18:13:32 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:09:48.798 18:13:32 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:48.798 18:13:32 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:48.798 ************************************
00:09:48.798 START TEST bdev_qos
00:09:48.798 ************************************
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite ''
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=2448365
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 2448365'
00:09:48.798 Process qos testing pid: 2448365
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 2448365
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 2448365 ']'
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:09:48.798 18:13:32 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:48.798 18:13:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:48.798 18:13:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:09:48.798 [2024-07-12 18:13:32.390720] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:09:48.798 [2024-07-12 18:13:32.390786] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2448365 ] 00:09:49.056 [2024-07-12 18:13:32.533399] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:49.056 [2024-07-12 18:13:32.667360] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.014 Malloc_0 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:50.014 18:13:33 
blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.014 [ 00:09:50.014 { 00:09:50.014 "name": "Malloc_0", 00:09:50.014 "aliases": [ 00:09:50.014 "6f7db9bf-c05f-49ae-96e9-50ff28c7d238" 00:09:50.014 ], 00:09:50.014 "product_name": "Malloc disk", 00:09:50.014 "block_size": 512, 00:09:50.014 "num_blocks": 262144, 00:09:50.014 "uuid": "6f7db9bf-c05f-49ae-96e9-50ff28c7d238", 00:09:50.014 "assigned_rate_limits": { 00:09:50.014 "rw_ios_per_sec": 0, 00:09:50.014 "rw_mbytes_per_sec": 0, 00:09:50.014 "r_mbytes_per_sec": 0, 00:09:50.014 "w_mbytes_per_sec": 0 00:09:50.014 }, 00:09:50.014 "claimed": false, 00:09:50.014 "zoned": false, 00:09:50.014 "supported_io_types": { 00:09:50.014 "read": true, 00:09:50.014 "write": true, 00:09:50.014 "unmap": true, 00:09:50.014 "flush": true, 00:09:50.014 "reset": true, 00:09:50.014 "nvme_admin": false, 00:09:50.014 "nvme_io": false, 00:09:50.014 "nvme_io_md": false, 00:09:50.014 "write_zeroes": true, 00:09:50.014 "zcopy": true, 00:09:50.014 "get_zone_info": false, 00:09:50.014 "zone_management": 
false, 00:09:50.014 "zone_append": false, 00:09:50.014 "compare": false, 00:09:50.014 "compare_and_write": false, 00:09:50.014 "abort": true, 00:09:50.014 "seek_hole": false, 00:09:50.014 "seek_data": false, 00:09:50.014 "copy": true, 00:09:50.014 "nvme_iov_md": false 00:09:50.014 }, 00:09:50.014 "memory_domains": [ 00:09:50.014 { 00:09:50.014 "dma_device_id": "system", 00:09:50.014 "dma_device_type": 1 00:09:50.014 }, 00:09:50.014 { 00:09:50.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:50.014 "dma_device_type": 2 00:09:50.014 } 00:09:50.014 ], 00:09:50.014 "driver_specific": {} 00:09:50.014 } 00:09:50.014 ] 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.014 Null_1 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:09:50.014 18:13:33 
blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:50.014 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:50.273 [ 00:09:50.273 { 00:09:50.273 "name": "Null_1", 00:09:50.273 "aliases": [ 00:09:50.273 "84dea494-df0a-479a-a988-39652fbcefbe" 00:09:50.273 ], 00:09:50.273 "product_name": "Null disk", 00:09:50.273 "block_size": 512, 00:09:50.273 "num_blocks": 262144, 00:09:50.273 "uuid": "84dea494-df0a-479a-a988-39652fbcefbe", 00:09:50.273 "assigned_rate_limits": { 00:09:50.273 "rw_ios_per_sec": 0, 00:09:50.273 "rw_mbytes_per_sec": 0, 00:09:50.273 "r_mbytes_per_sec": 0, 00:09:50.273 "w_mbytes_per_sec": 0 00:09:50.273 }, 00:09:50.273 "claimed": false, 00:09:50.273 "zoned": false, 00:09:50.273 "supported_io_types": { 00:09:50.273 "read": true, 00:09:50.273 "write": true, 00:09:50.273 "unmap": false, 00:09:50.273 "flush": false, 00:09:50.273 "reset": true, 00:09:50.273 "nvme_admin": false, 00:09:50.273 "nvme_io": false, 00:09:50.273 "nvme_io_md": false, 00:09:50.273 "write_zeroes": true, 00:09:50.273 "zcopy": false, 00:09:50.273 "get_zone_info": false, 00:09:50.273 "zone_management": false, 00:09:50.273 "zone_append": false, 00:09:50.273 "compare": false, 00:09:50.273 "compare_and_write": false, 00:09:50.273 "abort": true, 00:09:50.273 "seek_hole": false, 00:09:50.273 "seek_data": false, 00:09:50.273 "copy": false, 00:09:50.273 "nvme_iov_md": false 00:09:50.273 }, 00:09:50.273 "driver_specific": {} 00:09:50.273 } 00:09:50.273 ] 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:50.273 18:13:33 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:09:50.273 Running I/O for 60 seconds... 
00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 63408.66 253634.64 0.00 0.00 254976.00 0.00 0.00 ' 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=63408.66 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 63408 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=63408 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=15000 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 15000 -gt 1000 ']' 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 15000 Malloc_0 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 15000 IOPS Malloc_0 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:55.542 18:13:38 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:55.542 ************************************ 00:09:55.542 START TEST bdev_qos_iops 00:09:55.542 ************************************ 00:09:55.542 18:13:38 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 15000 IOPS Malloc_0 00:09:55.542 18:13:38 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=15000 00:09:55.542 18:13:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:55.542 18:13:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:09:55.542 18:13:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:09:55.542 18:13:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:55.542 18:13:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:55.542 18:13:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:55.542 18:13:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:55.542 18:13:38 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:10:00.807 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 15003.08 60012.31 0.00 0.00 61440.00 0.00 0.00 ' 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=15003.08 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 15003 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=15003 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=13500 00:10:00.808 18:13:44 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=16500 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 15003 -lt 13500 ']' 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 15003 -gt 16500 ']' 00:10:00.808 00:10:00.808 real 0m5.221s 00:10:00.808 user 0m0.085s 00:10:00.808 sys 0m0.044s 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:00.808 18:13:44 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:00.808 ************************************ 00:10:00.808 END TEST bdev_qos_iops 00:10:00.808 ************************************ 00:10:00.808 18:13:44 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:00.808 18:13:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:10:00.808 18:13:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:00.808 18:13:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:00.808 18:13:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:00.808 18:13:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:00.808 18:13:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:00.808 18:13:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 20235.78 80943.13 0.00 0.00 81920.00 0.00 0.00 ' 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:06.066 18:13:49 
blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=81920.00 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 81920 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=81920 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:06.066 18:13:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:06.066 ************************************ 00:10:06.066 START TEST bdev_qos_bw 00:10:06.066 ************************************ 00:10:06.066 18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:10:06.066 18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:10:06.066 18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:06.066 18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:10:06.066 
18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:06.066 18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:06.066 18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:06.066 18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:06.066 18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:06.066 18:13:49 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2047.88 8191.52 0.00 0.00 8388.00 0.00 0.00 ' 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8388.00 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8388 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8388 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@400 -- # '[' 8388 -lt 7372 ']' 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8388 -gt 9011 ']' 00:10:11.350 00:10:11.350 real 0m5.288s 00:10:11.350 user 0m0.119s 00:10:11.350 sys 0m0.043s 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:11.350 18:13:54 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:10:11.350 ************************************ 00:10:11.350 END TEST bdev_qos_bw 00:10:11.350 ************************************ 00:10:11.350 18:13:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:11.350 18:13:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:10:11.350 18:13:54 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:11.350 18:13:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:11.350 18:13:54 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:11.350 18:13:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:10:11.350 18:13:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:11.350 18:13:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:11.350 18:13:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:11.350 ************************************ 00:10:11.350 START TEST bdev_qos_ro_bw 00:10:11.350 ************************************ 00:10:11.350 18:13:55 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:10:11.350 18:13:55 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:10:11.350 18:13:55 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:11.350 18:13:55 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:10:11.350 18:13:55 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:11.350 18:13:55 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:11.350 18:13:55 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:11.350 18:13:55 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:11.351 18:13:55 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:11.351 18:13:55 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:16.662 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.77 2047.07 0.00 0.00 2060.00 0.00 0.00 ' 00:10:16.662 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:16.662 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:16.662 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:16.662 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:10:16.662 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:10:16.662 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:10:16.662 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:16.663 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 
00:10:16.663 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:16.663 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:16.663 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:10:16.663 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:10:16.663 00:10:16.663 real 0m5.188s 00:10:16.663 user 0m0.120s 00:10:16.663 sys 0m0.042s 00:10:16.663 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:16.663 18:14:00 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:16.663 ************************************ 00:10:16.663 END TEST bdev_qos_ro_bw 00:10:16.663 ************************************ 00:10:16.663 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:16.663 18:14:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:16.663 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:16.663 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:17.230 18:14:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:17.230 00:10:17.230 Latency(us) 00:10:17.230 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:17.230 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:17.230 Malloc_0 : 26.71 20835.10 81.39 0.00 0.00 12171.81 1980.33 503316.48 
00:10:17.230 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096)
00:10:17.230 Null_1 : 26.86 20528.29 80.19 0.00 0.00 12437.61 801.39 149536.06
00:10:17.230 ===================================================================================================================
00:10:17.230 Total : 41363.39 161.58 0.00 0.00 12304.09 801.39 503316.48
00:10:17.230 0
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 2448365
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 2448365 ']'
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 2448365
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2448365
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2448365'
00:10:17.230 killing process with pid 2448365
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 2448365
00:10:17.230 Received shutdown signal, test time was about 26.923084 seconds
00:10:17.230 
00:10:17.230 Latency(us)
00:10:17.230 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:17.230 ===================================================================================================================
00:10:17.230 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:10:17.230 18:14:00 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 2448365
00:10:17.489 18:14:01 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT
00:10:17.489 
00:10:17.489 real 0m28.843s
00:10:17.489 user 0m29.921s
00:10:17.489 sys 0m0.941s
00:10:17.489 18:14:01 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:17.489 18:14:01 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:10:17.489 ************************************
00:10:17.489 END TEST bdev_qos
00:10:17.489 ************************************
00:10:17.747 18:14:01 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:17.747 18:14:01 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite ''
00:10:17.747 18:14:01 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:10:17.747 18:14:01 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:17.747 18:14:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:17.747 ************************************
00:10:17.747 START TEST bdev_qd_sampling
00:10:17.747 ************************************
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite ''
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=2452268
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 2452268'
00:10:17.747 Process bdev QD sampling period testing pid: 2452268
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C ''
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 2452268
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 2452268 ']'
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:10:17.747 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable
00:10:17.747 18:14:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:10:17.747 [2024-07-12 18:14:01.319230] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:10:17.747 [2024-07-12 18:14:01.319300] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2452268 ]
00:10:17.747 [2024-07-12 18:14:01.448580] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2
00:10:18.005 [2024-07-12 18:14:01.548575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:10:18.005 [2024-07-12 18:14:01.548579] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:10:18.572 Malloc_QD
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:18.572 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:10:18.572 [
00:10:18.572 {
00:10:18.572 "name": "Malloc_QD",
00:10:18.572 "aliases": [
00:10:18.572 "f8fcd41c-48e8-47a2-8703-0f4ba38ee1a5"
00:10:18.572 ],
00:10:18.572 "product_name": "Malloc disk",
00:10:18.572 "block_size": 512,
00:10:18.572 "num_blocks": 262144,
00:10:18.572 "uuid": "f8fcd41c-48e8-47a2-8703-0f4ba38ee1a5",
00:10:18.572 "assigned_rate_limits": {
00:10:18.572 "rw_ios_per_sec": 0,
00:10:18.572 "rw_mbytes_per_sec": 0,
00:10:18.572 "r_mbytes_per_sec": 0,
00:10:18.572 "w_mbytes_per_sec": 0
00:10:18.572 },
00:10:18.572 "claimed": false,
00:10:18.572 "zoned": false,
00:10:18.572 "supported_io_types": {
00:10:18.572 "read": true,
00:10:18.572 "write": true,
00:10:18.572 "unmap": true,
00:10:18.572 "flush": true,
00:10:18.572 "reset": true,
00:10:18.572 "nvme_admin": false,
00:10:18.572 "nvme_io": false,
00:10:18.572 "nvme_io_md": false,
00:10:18.572 "write_zeroes": true,
00:10:18.572 "zcopy": true,
00:10:18.572 "get_zone_info": false,
00:10:18.572 "zone_management": false,
00:10:18.572 "zone_append": false,
00:10:18.572 "compare": false,
00:10:18.572 "compare_and_write": false,
00:10:18.572 "abort": true,
00:10:18.572 "seek_hole": false,
00:10:18.572 "seek_data": false,
00:10:18.572 "copy": true,
00:10:18.572 "nvme_iov_md": false
00:10:18.572 },
00:10:18.572 "memory_domains": [
00:10:18.572 {
00:10:18.572 "dma_device_id": "system",
00:10:18.572 "dma_device_type": 1
00:10:18.572 },
00:10:18.572 {
00:10:18.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:18.572 "dma_device_type": 2
00:10:18.572 }
00:10:18.572 ],
00:10:18.573 "driver_specific": {}
00:10:18.573 }
00:10:18.573 ]
00:10:18.573 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:18.573 18:14:02 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0
00:10:18.573 18:14:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2
00:10:18.573 18:14:02 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests
00:10:18.831 Running I/O for 5 seconds...
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{
00:10:20.731 "tick_rate": 2300000000,
00:10:20.731 "ticks": 4804182347493362,
00:10:20.731 "bdevs": [
00:10:20.731 {
00:10:20.731 "name": "Malloc_QD",
00:10:20.731 "bytes_read": 771797504,
00:10:20.731 "num_read_ops": 188420,
00:10:20.731 "bytes_written": 0,
00:10:20.731 "num_write_ops": 0,
00:10:20.731 "bytes_unmapped": 0,
00:10:20.731 "num_unmap_ops": 0,
00:10:20.731 "bytes_copied": 0,
00:10:20.731 "num_copy_ops": 0,
00:10:20.731 "read_latency_ticks": 2244822995636,
00:10:20.731 "max_read_latency_ticks": 14755274,
00:10:20.731 "min_read_latency_ticks": 284928,
00:10:20.731 "write_latency_ticks": 0,
00:10:20.731 "max_write_latency_ticks": 0,
00:10:20.731 "min_write_latency_ticks": 0,
00:10:20.731 "unmap_latency_ticks": 0,
00:10:20.731 "max_unmap_latency_ticks": 0,
00:10:20.731 "min_unmap_latency_ticks": 0,
00:10:20.731 "copy_latency_ticks": 0,
00:10:20.731 "max_copy_latency_ticks": 0,
00:10:20.731 "min_copy_latency_ticks": 0,
00:10:20.731 "io_error": {},
00:10:20.731 "queue_depth_polling_period": 10,
00:10:20.731 "queue_depth": 512,
00:10:20.731 "io_time": 30,
00:10:20.731 "weighted_io_time": 15360
00:10:20.731 }
00:10:20.731 ]
00:10:20.731 }'
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period'
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']'
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']'
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:10:20.731 
00:10:20.731 Latency(us)
00:10:20.731 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:20.731 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096)
00:10:20.731 Malloc_QD : 1.99 48807.87 190.66 0.00 0.00 5232.19 1410.45 5556.31
00:10:20.731 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096)
00:10:20.731 Malloc_QD : 1.99 49911.81 194.97 0.00 0.00 5117.03 961.67 6439.62
00:10:20.731 ===================================================================================================================
00:10:20.731 Total : 98719.69 385.62 0.00 0.00 5173.94 961.67 6439.62
00:10:20.731 0
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 2452268
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 2452268 ']'
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 2452268
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2452268
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2452268'
00:10:20.731 killing process with pid 2452268
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 2452268
00:10:20.731 Received shutdown signal, test time was about 2.067253 seconds
00:10:20.731 
00:10:20.731 Latency(us)
00:10:20.731 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:20.731 ===================================================================================================================
00:10:20.731 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:10:20.731 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 2452268
00:10:20.990 18:14:04 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT
00:10:20.990 
00:10:20.990 real 0m3.401s
00:10:20.990 user 0m6.595s
00:10:20.990 sys 0m0.431s
00:10:20.990 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:20.990 18:14:04 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:10:20.990 ************************************
00:10:20.990 END TEST bdev_qd_sampling
00:10:20.990 ************************************
00:10:20.990 18:14:04 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:20.990 18:14:04 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite ''
00:10:20.990 18:14:04 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:10:20.990 18:14:04 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:20.990 18:14:04 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:21.249 ************************************
00:10:21.249 START TEST bdev_error
00:10:21.249 ************************************
00:10:21.249 18:14:04 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite ''
00:10:21.249 18:14:04 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1
00:10:21.249 18:14:04 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2
00:10:21.249 18:14:04 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1
00:10:21.249 18:14:04 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=2452703
00:10:21.249 18:14:04 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 2452703'
00:10:21.249 Process error testing pid: 2452703
00:10:21.249 18:14:04 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f ''
00:10:21.249 18:14:04 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 2452703
00:10:21.249 18:14:04 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2452703 ']'
00:10:21.249 18:14:04 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:10:21.249 18:14:04 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100
00:10:21.249 18:14:04 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:10:21.249 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:10:21.249 18:14:04 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable
00:10:21.249 18:14:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:21.249 [2024-07-12 18:14:04.809085] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:10:21.249 [2024-07-12 18:14:04.809157] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2452703 ]
00:10:21.249 [2024-07-12 18:14:04.932421] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:21.508 [2024-07-12 18:14:05.031652] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0
00:10:22.076 18:14:05 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:22.076 Dev_1
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:22.076 18:14:05 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:22.076 [
00:10:22.076 {
00:10:22.076 "name": "Dev_1",
00:10:22.076 "aliases": [
00:10:22.076 "35c7dbaa-dad4-4863-a849-573510873021"
00:10:22.076 ],
00:10:22.076 "product_name": "Malloc disk",
00:10:22.076 "block_size": 512,
00:10:22.076 "num_blocks": 262144,
00:10:22.076 "uuid": "35c7dbaa-dad4-4863-a849-573510873021",
00:10:22.076 "assigned_rate_limits": {
00:10:22.076 "rw_ios_per_sec": 0,
00:10:22.076 "rw_mbytes_per_sec": 0,
00:10:22.076 "r_mbytes_per_sec": 0,
00:10:22.076 "w_mbytes_per_sec": 0
00:10:22.076 },
00:10:22.076 "claimed": false,
00:10:22.076 "zoned": false,
00:10:22.076 "supported_io_types": {
00:10:22.076 "read": true,
00:10:22.076 "write": true,
00:10:22.076 "unmap": true,
00:10:22.076 "flush": true,
00:10:22.076 "reset": true,
00:10:22.076 "nvme_admin": false,
00:10:22.076 "nvme_io": false,
00:10:22.076 "nvme_io_md": false,
00:10:22.076 "write_zeroes": true,
00:10:22.076 "zcopy": true,
00:10:22.076 "get_zone_info": false,
00:10:22.076 "zone_management": false,
00:10:22.076 "zone_append": false,
00:10:22.076 "compare": false,
00:10:22.076 "compare_and_write": false,
00:10:22.076 "abort": true,
00:10:22.076 "seek_hole": false,
00:10:22.076 "seek_data": false,
00:10:22.076 "copy": true,
00:10:22.076 "nvme_iov_md": false
00:10:22.076 },
00:10:22.076 "memory_domains": [
00:10:22.076 {
00:10:22.076 "dma_device_id": "system",
00:10:22.076 "dma_device_type": 1
00:10:22.076 },
00:10:22.076 {
00:10:22.076 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:22.076 "dma_device_type": 2
00:10:22.076 }
00:10:22.076 ],
00:10:22.076 "driver_specific": {}
00:10:22.076 }
00:10:22.076 ]
00:10:22.076 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0
00:10:22.335 18:14:05 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:22.335 true
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:22.335 18:14:05 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:22.335 Dev_2
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:22.335 18:14:05 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:22.335 [
00:10:22.335 {
00:10:22.335 "name": "Dev_2",
00:10:22.335 "aliases": [
00:10:22.335 "339c9928-1cee-4b49-b859-47c62fbfd748"
00:10:22.335 ],
00:10:22.335 "product_name": "Malloc disk",
00:10:22.335 "block_size": 512,
00:10:22.335 "num_blocks": 262144,
00:10:22.335 "uuid": "339c9928-1cee-4b49-b859-47c62fbfd748",
00:10:22.335 "assigned_rate_limits": {
00:10:22.335 "rw_ios_per_sec": 0,
00:10:22.335 "rw_mbytes_per_sec": 0,
00:10:22.335 "r_mbytes_per_sec": 0,
00:10:22.335 "w_mbytes_per_sec": 0
00:10:22.335 },
00:10:22.335 "claimed": false,
00:10:22.335 "zoned": false,
00:10:22.335 "supported_io_types": {
00:10:22.335 "read": true,
00:10:22.335 "write": true,
00:10:22.335 "unmap": true,
00:10:22.335 "flush": true,
00:10:22.335 "reset": true,
00:10:22.335 "nvme_admin": false,
00:10:22.335 "nvme_io": false,
00:10:22.335 "nvme_io_md": false,
00:10:22.335 "write_zeroes": true,
00:10:22.335 "zcopy": true,
00:10:22.335 "get_zone_info": false,
00:10:22.335 "zone_management": false,
00:10:22.335 "zone_append": false,
00:10:22.335 "compare": false,
00:10:22.335 "compare_and_write": false,
00:10:22.335 "abort": true,
00:10:22.335 "seek_hole": false,
00:10:22.335 "seek_data": false,
00:10:22.335 "copy": true,
00:10:22.335 "nvme_iov_md": false
00:10:22.335 },
00:10:22.335 "memory_domains": [
00:10:22.335 {
00:10:22.335 "dma_device_id": "system",
00:10:22.335 "dma_device_type": 1
00:10:22.335 },
00:10:22.335 {
00:10:22.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:22.335 "dma_device_type": 2
00:10:22.335 }
00:10:22.335 ],
00:10:22.335 "driver_specific": {}
00:10:22.335 }
00:10:22.335 ]
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0
00:10:22.335 18:14:05 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:22.335 18:14:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:22.335 18:14:05 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1
00:10:22.336 18:14:05 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests
00:10:22.336 Running I/O for 5 seconds...
00:10:23.272 18:14:06 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 2452703
00:10:23.272 18:14:06 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 2452703'
00:10:23.272 Process is existed as continue on error is set. Pid: 2452703
00:10:23.272 18:14:06 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1
00:10:23.272 18:14:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:23.272 18:14:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:23.272 18:14:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:23.272 18:14:06 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1
00:10:23.272 18:14:06 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:23.272 18:14:06 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:23.272 18:14:06 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:23.272 18:14:06 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5
00:10:23.530 Timeout while waiting for response:
00:10:23.530 
00:10:23.530 
00:10:27.862 
00:10:27.862 Latency(us)
00:10:27.862 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:27.862 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096)
00:10:27.862 EE_Dev_1 : 0.90 37769.39 147.54 5.57 0.00 420.07 130.89 690.98
00:10:27.862 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096)
00:10:27.862 Dev_2 : 5.00 82163.97 320.95 0.00 0.00 191.23 65.89 20971.52
00:10:27.862 ===================================================================================================================
00:10:27.862 Total : 119933.36 468.49 5.57 0.00 208.69 65.89 20971.52
00:10:28.429 18:14:11 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 2452703
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 2452703 ']'
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 2452703
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2452703
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2452703'
00:10:28.429 killing process with pid 2452703
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 2452703
00:10:28.429 Received shutdown signal, test time was about 5.000000 seconds
00:10:28.429 
00:10:28.429 Latency(us)
00:10:28.429 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:28.429 ===================================================================================================================
00:10:28.429 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:10:28.429 18:14:11 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 2452703
00:10:28.688 18:14:12 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=2453751
00:10:28.688 18:14:12 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 2453751'
00:10:28.688 Process error testing pid: 2453751
00:10:28.688 18:14:12 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 ''
00:10:28.688 18:14:12 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 2453751
00:10:28.688 18:14:12 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2453751 ']'
00:10:28.688 18:14:12 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:10:28.688 18:14:12 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100
00:10:28.688 18:14:12 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:10:28.688 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:10:28.688 18:14:12 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable
00:10:28.688 18:14:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:28.688 [2024-07-12 18:14:12.322874] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:10:28.688 [2024-07-12 18:14:12.322969] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2453751 ]
00:10:28.947 [2024-07-12 18:14:12.443496] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:28.947 [2024-07-12 18:14:12.544701] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0
00:10:29.883 18:14:13 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:29.883 Dev_1
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:29.883 18:14:13 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:29.883 [
00:10:29.883 {
00:10:29.883 "name": "Dev_1",
00:10:29.883 "aliases": [
00:10:29.883 "eaff47ea-1931-499b-a559-fabd09279aca"
00:10:29.883 ],
00:10:29.883 "product_name": "Malloc disk",
00:10:29.883 "block_size": 512,
00:10:29.883 "num_blocks": 262144,
00:10:29.883 "uuid": "eaff47ea-1931-499b-a559-fabd09279aca",
00:10:29.883 "assigned_rate_limits": {
00:10:29.883 "rw_ios_per_sec": 0,
00:10:29.883 "rw_mbytes_per_sec": 0,
00:10:29.883 "r_mbytes_per_sec": 0,
00:10:29.883 "w_mbytes_per_sec": 0
00:10:29.883 },
00:10:29.883 "claimed": false,
00:10:29.883 "zoned": false,
00:10:29.883 "supported_io_types": {
00:10:29.883 "read": true,
00:10:29.883 "write": true,
00:10:29.883 "unmap": true,
00:10:29.883 "flush": true,
00:10:29.883 "reset": true,
00:10:29.883 "nvme_admin": false,
00:10:29.883 "nvme_io": false,
00:10:29.883 "nvme_io_md": false,
00:10:29.883 "write_zeroes": true,
00:10:29.883 "zcopy": true,
00:10:29.883 "get_zone_info": false,
00:10:29.883 "zone_management": false,
00:10:29.883 "zone_append": false,
00:10:29.883 "compare": false,
00:10:29.883 "compare_and_write": false,
00:10:29.883 "abort": true,
00:10:29.883 "seek_hole": false,
00:10:29.883 "seek_data": false,
00:10:29.883 "copy": true,
00:10:29.883 "nvme_iov_md": false
00:10:29.883 },
00:10:29.883 "memory_domains": [
00:10:29.883 {
00:10:29.883 "dma_device_id": "system",
00:10:29.883 "dma_device_type": 1
00:10:29.883 },
00:10:29.883 {
00:10:29.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:10:29.883 "dma_device_type": 2
00:10:29.883 }
00:10:29.883 ],
00:10:29.883 "driver_specific": {}
00:10:29.883 }
00:10:29.883 ]
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0
00:10:29.883 18:14:13 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:29.883 true
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:10:29.883 18:14:13 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:10:29.883 Dev_2
00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@587
-- # [[ 0 == 0 ]] 00:10:29.883 18:14:13 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.883 [ 00:10:29.883 { 00:10:29.883 "name": "Dev_2", 00:10:29.883 "aliases": [ 00:10:29.883 "a0dd1551-9f5b-4070-b7de-da83880e9735" 00:10:29.883 ], 00:10:29.883 "product_name": "Malloc disk", 00:10:29.883 "block_size": 512, 00:10:29.883 "num_blocks": 262144, 00:10:29.883 "uuid": "a0dd1551-9f5b-4070-b7de-da83880e9735", 00:10:29.883 "assigned_rate_limits": { 00:10:29.883 "rw_ios_per_sec": 0, 00:10:29.883 "rw_mbytes_per_sec": 0, 00:10:29.883 "r_mbytes_per_sec": 0, 00:10:29.883 "w_mbytes_per_sec": 0 00:10:29.883 }, 00:10:29.883 "claimed": false, 00:10:29.883 "zoned": false, 00:10:29.883 "supported_io_types": { 
00:10:29.883 "read": true, 00:10:29.883 "write": true, 00:10:29.883 "unmap": true, 00:10:29.883 "flush": true, 00:10:29.883 "reset": true, 00:10:29.883 "nvme_admin": false, 00:10:29.883 "nvme_io": false, 00:10:29.883 "nvme_io_md": false, 00:10:29.883 "write_zeroes": true, 00:10:29.883 "zcopy": true, 00:10:29.883 "get_zone_info": false, 00:10:29.883 "zone_management": false, 00:10:29.883 "zone_append": false, 00:10:29.883 "compare": false, 00:10:29.883 "compare_and_write": false, 00:10:29.883 "abort": true, 00:10:29.883 "seek_hole": false, 00:10:29.883 "seek_data": false, 00:10:29.883 "copy": true, 00:10:29.883 "nvme_iov_md": false 00:10:29.883 }, 00:10:29.883 "memory_domains": [ 00:10:29.883 { 00:10:29.883 "dma_device_id": "system", 00:10:29.883 "dma_device_type": 1 00:10:29.883 }, 00:10:29.883 { 00:10:29.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:29.883 "dma_device_type": 2 00:10:29.883 } 00:10:29.883 ], 00:10:29.883 "driver_specific": {} 00:10:29.883 } 00:10:29.883 ] 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:29.883 18:14:13 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.883 18:14:13 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 2453751 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 2453751 00:10:29.883 18:14:13 blockdev_general.bdev_error -- 
bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:29.883 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 2453751 00:10:29.883 Running I/O for 5 seconds... 00:10:29.883 task offset: 126936 on job bdev=EE_Dev_1 fails 00:10:29.883 00:10:29.883 Latency(us) 00:10:29.883 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:29.883 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:29.883 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:29.883 EE_Dev_1 : 0.00 30303.03 118.37 6887.05 0.00 356.72 130.89 637.55 00:10:29.883 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:29.883 Dev_2 : 0.00 18401.38 71.88 0.00 0.00 646.65 125.55 1203.87 00:10:29.883 =================================================================================================================== 00:10:29.883 Total : 48704.41 190.25 6887.05 0.00 513.97 125.55 1203.87 00:10:29.883 [2024-07-12 18:14:13.524803] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:29.883 request: 00:10:29.883 { 00:10:29.883 "method": "perform_tests", 00:10:29.883 "req_id": 1 00:10:29.883 } 00:10:29.883 Got JSON-RPC error response 00:10:29.883 response: 00:10:29.883 { 00:10:29.883 "code": -32603, 00:10:29.883 "message": "bdevperf failed with error Operation not permitted" 00:10:29.883 } 00:10:30.142 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # 
es=255 00:10:30.142 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:30.142 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:30.142 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:30.142 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:30.142 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:30.142 00:10:30.142 real 0m9.078s 00:10:30.142 user 0m9.486s 00:10:30.142 sys 0m0.869s 00:10:30.142 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:30.142 18:14:13 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:30.142 ************************************ 00:10:30.142 END TEST bdev_error 00:10:30.142 ************************************ 00:10:30.142 18:14:13 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:30.142 18:14:13 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:30.142 18:14:13 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:30.142 18:14:13 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:30.142 18:14:13 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:30.401 ************************************ 00:10:30.401 START TEST bdev_stat 00:10:30.401 ************************************ 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=2453956 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 2453956' 00:10:30.401 Process Bdev IO statistics 
testing pid: 2453956 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 2453956 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 2453956 ']' 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:30.401 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:30.401 18:14:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:30.401 [2024-07-12 18:14:13.968705] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:10:30.401 [2024-07-12 18:14:13.968772] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2453956 ] 00:10:30.401 [2024-07-12 18:14:14.101582] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:30.660 [2024-07-12 18:14:14.211879] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:30.660 [2024-07-12 18:14:14.211885] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:31.228 Malloc_STAT 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:31.228 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:31.487 [ 00:10:31.487 { 00:10:31.487 "name": "Malloc_STAT", 00:10:31.487 "aliases": [ 00:10:31.487 "1526679e-126a-4683-bbe7-7c1b14dbd2b1" 00:10:31.487 ], 00:10:31.487 "product_name": "Malloc disk", 00:10:31.487 "block_size": 512, 00:10:31.487 "num_blocks": 262144, 00:10:31.487 "uuid": "1526679e-126a-4683-bbe7-7c1b14dbd2b1", 00:10:31.487 "assigned_rate_limits": { 00:10:31.487 "rw_ios_per_sec": 0, 00:10:31.487 "rw_mbytes_per_sec": 0, 00:10:31.487 "r_mbytes_per_sec": 0, 00:10:31.487 "w_mbytes_per_sec": 0 00:10:31.487 }, 00:10:31.487 "claimed": false, 00:10:31.487 "zoned": false, 00:10:31.487 "supported_io_types": { 00:10:31.487 "read": true, 00:10:31.487 "write": true, 00:10:31.487 "unmap": true, 00:10:31.487 "flush": true, 00:10:31.487 "reset": true, 00:10:31.487 "nvme_admin": false, 00:10:31.487 "nvme_io": false, 00:10:31.487 "nvme_io_md": false, 00:10:31.487 "write_zeroes": true, 00:10:31.487 "zcopy": true, 00:10:31.487 "get_zone_info": false, 00:10:31.487 "zone_management": false, 00:10:31.487 "zone_append": false, 00:10:31.487 "compare": false, 00:10:31.487 "compare_and_write": false, 00:10:31.487 "abort": true, 00:10:31.487 "seek_hole": false, 00:10:31.487 "seek_data": false, 00:10:31.487 "copy": true, 00:10:31.487 "nvme_iov_md": false 00:10:31.487 }, 00:10:31.487 "memory_domains": [ 00:10:31.487 { 00:10:31.487 "dma_device_id": "system", 
00:10:31.487 "dma_device_type": 1 00:10:31.487 }, 00:10:31.487 { 00:10:31.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:31.487 "dma_device_type": 2 00:10:31.487 } 00:10:31.487 ], 00:10:31.487 "driver_specific": {} 00:10:31.487 } 00:10:31.487 ] 00:10:31.487 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:31.487 18:14:14 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:31.487 18:14:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:31.487 18:14:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:31.487 Running I/O for 10 seconds... 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.391 18:14:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.391 
18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.391 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:33.391 "tick_rate": 2300000000, 00:10:33.391 "ticks": 4804211510076296, 00:10:33.391 "bdevs": [ 00:10:33.391 { 00:10:33.391 "name": "Malloc_STAT", 00:10:33.391 "bytes_read": 772846080, 00:10:33.391 "num_read_ops": 188676, 00:10:33.391 "bytes_written": 0, 00:10:33.391 "num_write_ops": 0, 00:10:33.391 "bytes_unmapped": 0, 00:10:33.391 "num_unmap_ops": 0, 00:10:33.391 "bytes_copied": 0, 00:10:33.391 "num_copy_ops": 0, 00:10:33.391 "read_latency_ticks": 2228367789160, 00:10:33.391 "max_read_latency_ticks": 14580776, 00:10:33.391 "min_read_latency_ticks": 278388, 00:10:33.391 "write_latency_ticks": 0, 00:10:33.391 "max_write_latency_ticks": 0, 00:10:33.391 "min_write_latency_ticks": 0, 00:10:33.391 "unmap_latency_ticks": 0, 00:10:33.391 "max_unmap_latency_ticks": 0, 00:10:33.391 "min_unmap_latency_ticks": 0, 00:10:33.391 "copy_latency_ticks": 0, 00:10:33.391 "max_copy_latency_ticks": 0, 00:10:33.391 "min_copy_latency_ticks": 0, 00:10:33.391 "io_error": {} 00:10:33.391 } 00:10:33.391 ] 00:10:33.391 }' 00:10:33.391 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:33.391 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=188676 00:10:33.391 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:33.391 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.391 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.391 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.391 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:33.391 "tick_rate": 2300000000, 00:10:33.391 "ticks": 4804211677651010, 
00:10:33.391 "name": "Malloc_STAT", 00:10:33.391 "channels": [ 00:10:33.391 { 00:10:33.391 "thread_id": 2, 00:10:33.391 "bytes_read": 398458880, 00:10:33.391 "num_read_ops": 97280, 00:10:33.391 "bytes_written": 0, 00:10:33.391 "num_write_ops": 0, 00:10:33.391 "bytes_unmapped": 0, 00:10:33.391 "num_unmap_ops": 0, 00:10:33.391 "bytes_copied": 0, 00:10:33.391 "num_copy_ops": 0, 00:10:33.391 "read_latency_ticks": 1156479728810, 00:10:33.391 "max_read_latency_ticks": 12671368, 00:10:33.391 "min_read_latency_ticks": 7631616, 00:10:33.391 "write_latency_ticks": 0, 00:10:33.391 "max_write_latency_ticks": 0, 00:10:33.391 "min_write_latency_ticks": 0, 00:10:33.391 "unmap_latency_ticks": 0, 00:10:33.391 "max_unmap_latency_ticks": 0, 00:10:33.391 "min_unmap_latency_ticks": 0, 00:10:33.391 "copy_latency_ticks": 0, 00:10:33.391 "max_copy_latency_ticks": 0, 00:10:33.391 "min_copy_latency_ticks": 0 00:10:33.391 }, 00:10:33.391 { 00:10:33.391 "thread_id": 3, 00:10:33.391 "bytes_read": 403701760, 00:10:33.391 "num_read_ops": 98560, 00:10:33.391 "bytes_written": 0, 00:10:33.391 "num_write_ops": 0, 00:10:33.391 "bytes_unmapped": 0, 00:10:33.391 "num_unmap_ops": 0, 00:10:33.391 "bytes_copied": 0, 00:10:33.391 "num_copy_ops": 0, 00:10:33.391 "read_latency_ticks": 1156784736452, 00:10:33.391 "max_read_latency_ticks": 14580776, 00:10:33.391 "min_read_latency_ticks": 7650756, 00:10:33.391 "write_latency_ticks": 0, 00:10:33.391 "max_write_latency_ticks": 0, 00:10:33.391 "min_write_latency_ticks": 0, 00:10:33.391 "unmap_latency_ticks": 0, 00:10:33.391 "max_unmap_latency_ticks": 0, 00:10:33.391 "min_unmap_latency_ticks": 0, 00:10:33.391 "copy_latency_ticks": 0, 00:10:33.391 "max_copy_latency_ticks": 0, 00:10:33.391 "min_copy_latency_ticks": 0 00:10:33.391 } 00:10:33.391 ] 00:10:33.391 }' 00:10:33.391 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # 
io_count_per_channel1=97280 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=97280 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=98560 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=195840 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:33.650 "tick_rate": 2300000000, 00:10:33.650 "ticks": 4804211952349910, 00:10:33.650 "bdevs": [ 00:10:33.650 { 00:10:33.650 "name": "Malloc_STAT", 00:10:33.650 "bytes_read": 851489280, 00:10:33.650 "num_read_ops": 207876, 00:10:33.650 "bytes_written": 0, 00:10:33.650 "num_write_ops": 0, 00:10:33.650 "bytes_unmapped": 0, 00:10:33.650 "num_unmap_ops": 0, 00:10:33.650 "bytes_copied": 0, 00:10:33.650 "num_copy_ops": 0, 00:10:33.650 "read_latency_ticks": 2456204358190, 00:10:33.650 "max_read_latency_ticks": 14580776, 00:10:33.650 "min_read_latency_ticks": 278388, 00:10:33.650 "write_latency_ticks": 0, 00:10:33.650 "max_write_latency_ticks": 0, 00:10:33.650 "min_write_latency_ticks": 0, 00:10:33.650 "unmap_latency_ticks": 0, 00:10:33.650 "max_unmap_latency_ticks": 0, 00:10:33.650 "min_unmap_latency_ticks": 0, 00:10:33.650 "copy_latency_ticks": 0, 00:10:33.650 "max_copy_latency_ticks": 0, 00:10:33.650 "min_copy_latency_ticks": 0, 00:10:33.650 "io_error": {} 00:10:33.650 } 00:10:33.650 ] 00:10:33.650 }' 00:10:33.650 
18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=207876 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 195840 -lt 188676 ']' 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 195840 -gt 207876 ']' 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.650 00:10:33.650 Latency(us) 00:10:33.650 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:33.650 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:33.650 Malloc_STAT : 2.16 49442.45 193.13 0.00 0.00 5165.40 1374.83 5527.82 00:10:33.650 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:33.650 Malloc_STAT : 2.17 50106.54 195.73 0.00 0.00 5097.54 947.42 6354.14 00:10:33.650 =================================================================================================================== 00:10:33.650 Total : 99548.99 388.86 0.00 0.00 5131.23 947.42 6354.14 00:10:33.650 0 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 2453956 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 2453956 ']' 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 2453956 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2453956 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2453956' 00:10:33.650 killing process with pid 2453956 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 2453956 00:10:33.650 Received shutdown signal, test time was about 2.249530 seconds 00:10:33.650 00:10:33.650 Latency(us) 00:10:33.650 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:33.650 =================================================================================================================== 00:10:33.650 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:33.650 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 2453956 00:10:33.910 18:14:17 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:33.910 00:10:33.910 real 0m3.666s 00:10:33.910 user 0m7.313s 00:10:33.910 sys 0m0.486s 00:10:33.910 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:33.910 18:14:17 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:33.910 ************************************ 00:10:33.910 END TEST bdev_stat 00:10:33.910 ************************************ 00:10:33.910 18:14:17 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:33.910 18:14:17 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:33.910 18:14:17 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:33.910 18:14:17 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 
00:10:33.910 18:14:17 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:33.910 18:14:17 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:33.910 18:14:17 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:33.910 18:14:17 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:33.910 18:14:17 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:33.911 18:14:17 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:33.911 18:14:17 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:33.911 00:10:33.911 real 1m57.757s 00:10:33.911 user 7m12.629s 00:10:33.911 sys 0m23.008s 00:10:33.911 18:14:17 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:33.911 18:14:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:33.911 ************************************ 00:10:33.911 END TEST blockdev_general 00:10:33.911 ************************************ 00:10:34.170 18:14:17 -- common/autotest_common.sh@1142 -- # return 0 00:10:34.170 18:14:17 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:34.170 18:14:17 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:34.170 18:14:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:34.170 18:14:17 -- common/autotest_common.sh@10 -- # set +x 00:10:34.170 ************************************ 00:10:34.170 START TEST bdev_raid 00:10:34.170 ************************************ 00:10:34.170 18:14:17 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:34.170 * Looking for test storage... 
00:10:34.170 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:34.170 18:14:17 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:34.170 18:14:17 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:34.170 18:14:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:34.170 18:14:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:34.170 18:14:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:34.170 ************************************ 00:10:34.170 START TEST raid_function_test_raid0 00:10:34.170 ************************************ 00:10:34.170 18:14:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:10:34.170 18:14:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:34.170 18:14:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:34.170 18:14:17 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=2454565 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2454565' 00:10:34.428 Process raid pid: 2454565 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 2454565 /var/tmp/spdk-raid.sock 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 2454565 ']' 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:34.428 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:34.428 18:14:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:34.428 [2024-07-12 18:14:17.956712] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:10:34.428 [2024-07-12 18:14:17.956781] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:34.428 [2024-07-12 18:14:18.098883] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:34.686 [2024-07-12 18:14:18.237521] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:34.686 [2024-07-12 18:14:18.307071] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:34.686 [2024-07-12 18:14:18.307107] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:35.621 18:14:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:35.621 18:14:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:10:35.621 18:14:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:35.621 18:14:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:35.621 18:14:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:35.621 18:14:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:35.621 18:14:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:35.621 [2024-07-12 18:14:19.245054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:35.621 [2024-07-12 18:14:19.246510] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:35.621 [2024-07-12 18:14:19.246568] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1771bd0 00:10:35.621 [2024-07-12 18:14:19.246578] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:35.621 [2024-07-12 18:14:19.246770] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1771b10 00:10:35.621 [2024-07-12 18:14:19.246888] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1771bd0 00:10:35.621 [2024-07-12 18:14:19.246898] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1771bd0 00:10:35.621 [2024-07-12 18:14:19.247020] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:35.621 Base_1 00:10:35.621 Base_2 00:10:35.621 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:35.621 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:35.621 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:35.879 18:14:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:36.446 [2024-07-12 18:14:19.995063] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19258e0 00:10:36.446 /dev/nbd0 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:36.446 1+0 records in 00:10:36.446 1+0 
records out 00:10:36.446 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278451 s, 14.7 MB/s 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:36.446 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:36.704 { 00:10:36.704 "nbd_device": "/dev/nbd0", 00:10:36.704 "bdev_name": "raid" 00:10:36.704 } 00:10:36.704 ]' 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:36.704 { 00:10:36.704 "nbd_device": "/dev/nbd0", 00:10:36.704 "bdev_name": "raid" 00:10:36.704 } 00:10:36.704 ]' 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:10:36.704 18:14:20 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:36.704 4096+0 records in 00:10:36.704 4096+0 records out 00:10:36.704 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0287472 s, 73.0 MB/s 00:10:36.704 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:37.271 4096+0 records in 00:10:37.271 4096+0 records out 00:10:37.271 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.286877 s, 7.3 MB/s 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:37.271 128+0 records in 00:10:37.271 128+0 records out 00:10:37.271 65536 
bytes (66 kB, 64 KiB) copied, 0.000821913 s, 79.7 MB/s 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:37.271 2035+0 records in 00:10:37.271 2035+0 records out 00:10:37.271 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00538196 s, 194 MB/s 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:37.271 456+0 records in 00:10:37.271 456+0 records out 00:10:37.271 233472 bytes (233 kB, 228 KiB) copied, 0.00275207 s, 84.8 MB/s 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:37.271 18:14:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:37.529 [2024-07-12 18:14:21.065048] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:37.529 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 2454565 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 2454565 ']' 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 2454565 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2454565 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2454565' 00:10:37.788 killing process with pid 2454565 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 2454565 00:10:37.788 [2024-07-12 18:14:21.423409] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:37.788 [2024-07-12 18:14:21.423482] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:37.788 [2024-07-12 18:14:21.423525] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:37.788 [2024-07-12 18:14:21.423537] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1771bd0 name 
raid, state offline 00:10:37.788 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 2454565 00:10:37.788 [2024-07-12 18:14:21.442431] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:38.046 18:14:21 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:10:38.046 00:10:38.046 real 0m3.769s 00:10:38.046 user 0m5.102s 00:10:38.046 sys 0m1.325s 00:10:38.046 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:38.046 18:14:21 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:38.046 ************************************ 00:10:38.046 END TEST raid_function_test_raid0 00:10:38.046 ************************************ 00:10:38.046 18:14:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:38.046 18:14:21 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:10:38.046 18:14:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:38.046 18:14:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:38.046 18:14:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:38.046 ************************************ 00:10:38.046 START TEST raid_function_test_concat 00:10:38.046 ************************************ 00:10:38.046 18:14:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:10:38.046 18:14:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:10:38.046 18:14:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:38.046 18:14:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:38.046 18:14:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=2455171 00:10:38.047 18:14:21 bdev_raid.raid_function_test_concat -- 
bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2455171' 00:10:38.047 Process raid pid: 2455171 00:10:38.047 18:14:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:38.047 18:14:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 2455171 /var/tmp/spdk-raid.sock 00:10:38.047 18:14:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 2455171 ']' 00:10:38.047 18:14:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:38.047 18:14:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:38.047 18:14:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:38.047 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:38.047 18:14:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:38.047 18:14:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:38.304 [2024-07-12 18:14:21.808919] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:10:38.304 [2024-07-12 18:14:21.808989] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:38.304 [2024-07-12 18:14:21.937162] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:38.563 [2024-07-12 18:14:22.041295] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:38.563 [2024-07-12 18:14:22.100106] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:38.563 [2024-07-12 18:14:22.100131] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:39.129 18:14:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:39.129 18:14:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:10:39.129 18:14:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:10:39.129 18:14:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:10:39.129 18:14:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:39.129 18:14:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:10:39.129 18:14:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:39.385 [2024-07-12 18:14:23.007095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:39.385 [2024-07-12 18:14:23.008562] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:39.385 [2024-07-12 18:14:23.008619] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd66bd0 00:10:39.385 [2024-07-12 18:14:23.008629] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:39.386 [2024-07-12 18:14:23.008814] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd66b10 00:10:39.386 [2024-07-12 18:14:23.008943] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd66bd0 00:10:39.386 [2024-07-12 18:14:23.008955] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xd66bd0 00:10:39.386 [2024-07-12 18:14:23.009057] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:39.386 Base_1 00:10:39.386 Base_2 00:10:39.386 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:39.386 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:39.386 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:39.642 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:39.898 [2024-07-12 18:14:23.512441] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf1a8e0 00:10:39.898 /dev/nbd0 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:39.898 1+0 records in 
00:10:39.898 1+0 records out 00:10:39.898 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261637 s, 15.7 MB/s 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:39.898 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:40.154 { 00:10:40.154 "nbd_device": "/dev/nbd0", 00:10:40.154 "bdev_name": "raid" 00:10:40.154 } 00:10:40.154 ]' 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:40.154 { 00:10:40.154 "nbd_device": "/dev/nbd0", 00:10:40.154 "bdev_name": "raid" 00:10:40.154 } 00:10:40.154 ]' 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:40.154 18:14:23 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:10:40.154 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:10:40.410 4096+0 records in 00:10:40.410 4096+0 records out 00:10:40.410 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0287555 s, 72.9 MB/s 00:10:40.410 18:14:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:10:40.666 4096+0 records in 00:10:40.666 4096+0 records out 00:10:40.666 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.266037 s, 7.9 MB/s 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:10:40.666 
128+0 records in 00:10:40.666 128+0 records out 00:10:40.666 65536 bytes (66 kB, 64 KiB) copied, 0.000846459 s, 77.4 MB/s 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:10:40.666 2035+0 records in 00:10:40.666 2035+0 records out 00:10:40.666 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0119108 s, 87.5 MB/s 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:10:40.666 456+0 records in 00:10:40.666 456+0 records out 00:10:40.666 233472 bytes (233 kB, 228 KiB) copied, 0.00273301 s, 85.4 MB/s 00:10:40.666 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:40.667 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 
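The three loop iterations traced above (bdev_raid.sh@38-54) all follow the same pattern: zero a block range in the reference file with `dd conv=notrunc`, discard the same range on the raid device with `blkdiscard -o/-l`, flush, and byte-compare the full 2 MiB. A minimal self-contained sketch of that flow is below; it substitutes a plain temp file for `/dev/nbd0` (and zeroing for `blkdiscard`, on the test's assumption that discarded regions read back as zeroes), so the paths and the simulated discard are illustrative, not the test's actual device I/O.

```shell
#!/bin/sh
# Sketch of raid_unmap_data_verify: same offsets/counts as the log,
# but against temp files so it runs without an nbd device attached.
set -e
blksize=512
ref=$(mktemp)   # stands in for /raidtest/raidrandtest
dev=$(mktemp)   # stands in for /dev/nbd0
# Fill the "device" with 4096 random 512-byte blocks, keep a reference copy.
dd if=/dev/urandom of="$ref" bs=$blksize count=4096 2>/dev/null
cp "$ref" "$dev"
unmap_blk_offs="0 1028 321"
unmap_blk_nums="128 2035 456"
set -- $unmap_blk_nums
for off in $unmap_blk_offs; do
    num=$1; shift
    # Zero the range in the reference (dd conv=notrunc in the real test)...
    dd if=/dev/zero of="$ref" bs=$blksize seek=$off count=$num conv=notrunc 2>/dev/null
    # ...and "discard" the same range on the device (blkdiscard -o/-l there).
    dd if=/dev/zero of="$dev" bs=$blksize seek=$off count=$num conv=notrunc 2>/dev/null
    # Byte-compare the whole 2 MiB region; cmp exits non-zero on mismatch.
    cmp -n 2097152 "$ref" "$dev"
done
echo OK
```

Because identical operations are applied to both files, every `cmp` passes and the script prints `OK`; on real hardware the interesting failure mode is a device whose discard does not deterministically return zeroes, which is exactly what the `cmp` after each `blkdiscard` catches.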
00:10:40.923 [2024-07-12 18:14:24.494688] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:40.923 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- 
# true 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 2455171 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 2455171 ']' 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 2455171 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2455171 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2455171' 00:10:41.180 killing process with pid 2455171 00:10:41.180 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 2455171 00:10:41.180 [2024-07-12 18:14:24.851549] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:41.180 [2024-07-12 18:14:24.851613] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:41.180 [2024-07-12 18:14:24.851655] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
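The `nbd_get_count` calls above (nbd_common.sh@61-66) parse the `nbd_get_disks` RPC response with `jq` and count attached `/dev/nbd` entries. The sketch below replays that pipeline on a canned copy of the JSON seen earlier in this log; in the live test the JSON comes from `rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks`, and `jq` is assumed to be installed.

```shell
#!/bin/sh
# Sketch of nbd_common.sh's nbd_get_count, using a canned RPC response.
nbd_disks_json='[ { "nbd_device": "/dev/nbd0", "bdev_name": "raid" } ]'
# Extract one device path per line.
nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
# grep -c exits non-zero on zero matches, hence the `|| true` guard --
# that is the `# true` visible in the log when the disk list is empty.
count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)
echo "$count"
```

With the one-element list above this prints `1`; after `nbd_stop_disk`, the RPC returns `[]`, `jq` emits nothing, and the guarded `grep -c` yields `0`, matching the `count=0` check at bdev_raid.sh@105-106.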
00:10:41.181 [2024-07-12 18:14:24.851667] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd66bd0 name raid, state offline 00:10:41.181 18:14:24 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 2455171 00:10:41.181 [2024-07-12 18:14:24.867993] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:41.438 18:14:25 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:10:41.438 00:10:41.438 real 0m3.319s 00:10:41.438 user 0m4.375s 00:10:41.438 sys 0m1.230s 00:10:41.438 18:14:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:41.438 18:14:25 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:10:41.438 ************************************ 00:10:41.438 END TEST raid_function_test_concat 00:10:41.438 ************************************ 00:10:41.438 18:14:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:41.438 18:14:25 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:10:41.438 18:14:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:41.438 18:14:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:41.438 18:14:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:41.438 ************************************ 00:10:41.438 START TEST raid0_resize_test 00:10:41.438 ************************************ 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:10:41.438 
18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=2455622 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 2455622' 00:10:41.438 Process raid pid: 2455622 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 2455622 /var/tmp/spdk-raid.sock 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 2455622 ']' 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:41.438 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:41.438 18:14:25 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:41.695 [2024-07-12 18:14:25.199618] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:10:41.695 [2024-07-12 18:14:25.199683] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:41.695 [2024-07-12 18:14:25.328341] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:41.952 [2024-07-12 18:14:25.435378] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.952 [2024-07-12 18:14:25.501471] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:41.952 [2024-07-12 18:14:25.501503] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:42.518 18:14:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:42.518 18:14:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:10:42.518 18:14:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:10:42.776 Base_1 00:10:42.776 18:14:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:10:43.035 Base_2 00:10:43.035 18:14:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:10:43.293 [2024-07-12 18:14:26.825262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:43.293 [2024-07-12 18:14:26.826615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:43.293 [2024-07-12 18:14:26.826663] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa52780 00:10:43.293 [2024-07-12 18:14:26.826673] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:43.293 [2024-07-12 18:14:26.826879] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x59e020 00:10:43.293 [2024-07-12 18:14:26.826980] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa52780 00:10:43.293 [2024-07-12 18:14:26.826990] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0xa52780 00:10:43.293 [2024-07-12 18:14:26.827096] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:43.293 18:14:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:10:43.552 [2024-07-12 18:14:27.065874] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:43.552 [2024-07-12 18:14:27.065897] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:10:43.552 true 00:10:43.552 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:43.552 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:10:43.811 [2024-07-12 18:14:27.310673] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:43.811 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:10:43.811 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:10:43.811 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:10:43.811 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:10:44.070 
[2024-07-12 18:14:27.555147] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:44.070 [2024-07-12 18:14:27.555167] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:10:44.070 [2024-07-12 18:14:27.555191] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:10:44.070 true 00:10:44.070 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:44.070 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:10:44.070 [2024-07-12 18:14:27.795935] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 2455622 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 2455622 ']' 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 2455622 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2455622 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2455622' 00:10:44.328 killing process with pid 2455622 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 2455622 00:10:44.328 [2024-07-12 18:14:27.867365] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:44.328 [2024-07-12 18:14:27.867416] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:44.328 [2024-07-12 18:14:27.867455] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:44.328 [2024-07-12 18:14:27.867466] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa52780 name Raid, state offline 00:10:44.328 18:14:27 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 2455622 00:10:44.328 [2024-07-12 18:14:27.868704] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:44.586 18:14:28 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:10:44.586 00:10:44.586 real 0m2.921s 00:10:44.586 user 0m4.510s 00:10:44.586 sys 0m0.634s 00:10:44.586 18:14:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:44.586 18:14:28 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:10:44.586 ************************************ 00:10:44.586 END TEST raid0_resize_test 00:10:44.586 ************************************ 00:10:44.586 18:14:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:10:44.586 18:14:28 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:10:44.586 18:14:28 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:44.587 18:14:28 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:10:44.587 18:14:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 
-le 1 ']' 00:10:44.587 18:14:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:44.587 18:14:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:44.587 ************************************ 00:10:44.587 START TEST raid_state_function_test 00:10:44.587 ************************************ 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:44.587 18:14:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2456021 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2456021' 00:10:44.587 Process raid pid: 2456021 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2456021 /var/tmp/spdk-raid.sock 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2456021 ']' 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:44.587 18:14:28 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:44.587 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:44.587 18:14:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:44.587 [2024-07-12 18:14:28.177661] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:10:44.587 [2024-07-12 18:14:28.177708] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:44.587 [2024-07-12 18:14:28.290715] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:44.844 [2024-07-12 18:14:28.399862] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.844 [2024-07-12 18:14:28.465656] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:44.844 [2024-07-12 18:14:28.465701] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:45.777 [2024-07-12 18:14:29.365366] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:45.777 [2024-07-12 18:14:29.365406] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:45.777 [2024-07-12 18:14:29.365417] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:45.777 [2024-07-12 18:14:29.365429] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.777 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:46.035 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:46.035 "name": "Existed_Raid", 00:10:46.035 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:10:46.035 "strip_size_kb": 64, 00:10:46.035 "state": "configuring", 00:10:46.035 "raid_level": "raid0", 00:10:46.035 "superblock": false, 00:10:46.035 "num_base_bdevs": 2, 00:10:46.035 "num_base_bdevs_discovered": 0, 00:10:46.035 "num_base_bdevs_operational": 2, 00:10:46.035 "base_bdevs_list": [ 00:10:46.035 { 00:10:46.035 "name": "BaseBdev1", 00:10:46.035 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:46.035 "is_configured": false, 00:10:46.035 "data_offset": 0, 00:10:46.035 "data_size": 0 00:10:46.035 }, 00:10:46.035 { 00:10:46.035 "name": "BaseBdev2", 00:10:46.035 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:46.035 "is_configured": false, 00:10:46.035 "data_offset": 0, 00:10:46.035 "data_size": 0 00:10:46.035 } 00:10:46.035 ] 00:10:46.035 }' 00:10:46.035 18:14:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:46.035 18:14:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:46.600 18:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:46.858 [2024-07-12 18:14:30.448129] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:46.858 [2024-07-12 18:14:30.448161] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1030a80 name Existed_Raid, state configuring 00:10:46.858 18:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:47.116 [2024-07-12 18:14:30.692775] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:47.116 [2024-07-12 18:14:30.692811] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:10:47.116 [2024-07-12 18:14:30.692822] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:47.116 [2024-07-12 18:14:30.692833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:47.116 18:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:47.375 [2024-07-12 18:14:30.951381] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:47.375 BaseBdev1 00:10:47.375 18:14:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:47.375 18:14:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:47.375 18:14:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:47.375 18:14:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:47.375 18:14:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:47.375 18:14:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:47.375 18:14:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:47.634 18:14:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:47.894 [ 00:10:47.894 { 00:10:47.894 "name": "BaseBdev1", 00:10:47.894 "aliases": [ 00:10:47.894 "e93965b0-409c-4b48-83b4-1cf905b7f071" 00:10:47.894 ], 00:10:47.894 "product_name": "Malloc disk", 00:10:47.894 "block_size": 512, 00:10:47.894 "num_blocks": 65536, 
00:10:47.894 "uuid": "e93965b0-409c-4b48-83b4-1cf905b7f071", 00:10:47.894 "assigned_rate_limits": { 00:10:47.894 "rw_ios_per_sec": 0, 00:10:47.894 "rw_mbytes_per_sec": 0, 00:10:47.894 "r_mbytes_per_sec": 0, 00:10:47.894 "w_mbytes_per_sec": 0 00:10:47.894 }, 00:10:47.894 "claimed": true, 00:10:47.894 "claim_type": "exclusive_write", 00:10:47.894 "zoned": false, 00:10:47.894 "supported_io_types": { 00:10:47.894 "read": true, 00:10:47.894 "write": true, 00:10:47.894 "unmap": true, 00:10:47.894 "flush": true, 00:10:47.894 "reset": true, 00:10:47.894 "nvme_admin": false, 00:10:47.894 "nvme_io": false, 00:10:47.894 "nvme_io_md": false, 00:10:47.894 "write_zeroes": true, 00:10:47.894 "zcopy": true, 00:10:47.894 "get_zone_info": false, 00:10:47.894 "zone_management": false, 00:10:47.894 "zone_append": false, 00:10:47.894 "compare": false, 00:10:47.894 "compare_and_write": false, 00:10:47.894 "abort": true, 00:10:47.894 "seek_hole": false, 00:10:47.894 "seek_data": false, 00:10:47.894 "copy": true, 00:10:47.894 "nvme_iov_md": false 00:10:47.894 }, 00:10:47.894 "memory_domains": [ 00:10:47.894 { 00:10:47.894 "dma_device_id": "system", 00:10:47.894 "dma_device_type": 1 00:10:47.894 }, 00:10:47.894 { 00:10:47.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:47.894 "dma_device_type": 2 00:10:47.894 } 00:10:47.894 ], 00:10:47.894 "driver_specific": {} 00:10:47.894 } 00:10:47.894 ] 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.894 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:48.153 18:14:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:48.153 "name": "Existed_Raid", 00:10:48.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:48.153 "strip_size_kb": 64, 00:10:48.153 "state": "configuring", 00:10:48.153 "raid_level": "raid0", 00:10:48.153 "superblock": false, 00:10:48.153 "num_base_bdevs": 2, 00:10:48.153 "num_base_bdevs_discovered": 1, 00:10:48.153 "num_base_bdevs_operational": 2, 00:10:48.153 "base_bdevs_list": [ 00:10:48.153 { 00:10:48.153 "name": "BaseBdev1", 00:10:48.153 "uuid": "e93965b0-409c-4b48-83b4-1cf905b7f071", 00:10:48.153 "is_configured": true, 00:10:48.153 "data_offset": 0, 00:10:48.153 "data_size": 65536 00:10:48.153 }, 00:10:48.153 { 00:10:48.153 "name": "BaseBdev2", 00:10:48.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:48.153 "is_configured": false, 00:10:48.153 "data_offset": 0, 00:10:48.153 "data_size": 0 00:10:48.153 } 00:10:48.153 ] 00:10:48.153 }' 00:10:48.153 18:14:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:48.153 18:14:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:48.721 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:48.980 [2024-07-12 18:14:32.459382] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:48.980 [2024-07-12 18:14:32.459421] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1030350 name Existed_Raid, state configuring 00:10:48.980 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:48.980 [2024-07-12 18:14:32.704057] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:48.980 [2024-07-12 18:14:32.705593] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:48.980 [2024-07-12 18:14:32.705628] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:49.239 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:49.239 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:49.239 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:49.239 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:49.239 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:49.239 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:10:49.239 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:49.240 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:49.240 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:49.240 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:49.240 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:49.240 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:49.240 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:49.240 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:49.498 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:49.498 "name": "Existed_Raid", 00:10:49.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:49.498 "strip_size_kb": 64, 00:10:49.498 "state": "configuring", 00:10:49.498 "raid_level": "raid0", 00:10:49.498 "superblock": false, 00:10:49.498 "num_base_bdevs": 2, 00:10:49.498 "num_base_bdevs_discovered": 1, 00:10:49.498 "num_base_bdevs_operational": 2, 00:10:49.498 "base_bdevs_list": [ 00:10:49.498 { 00:10:49.498 "name": "BaseBdev1", 00:10:49.498 "uuid": "e93965b0-409c-4b48-83b4-1cf905b7f071", 00:10:49.498 "is_configured": true, 00:10:49.498 "data_offset": 0, 00:10:49.498 "data_size": 65536 00:10:49.498 }, 00:10:49.498 { 00:10:49.498 "name": "BaseBdev2", 00:10:49.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:49.498 "is_configured": false, 00:10:49.498 "data_offset": 0, 00:10:49.498 "data_size": 0 00:10:49.498 } 00:10:49.498 ] 00:10:49.498 }' 
00:10:49.498 18:14:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:49.498 18:14:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:50.066 18:14:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:50.066 [2024-07-12 18:14:33.718105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:50.067 [2024-07-12 18:14:33.718138] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1031000 00:10:50.067 [2024-07-12 18:14:33.718147] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:50.067 [2024-07-12 18:14:33.718334] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf4b0c0 00:10:50.067 [2024-07-12 18:14:33.718449] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1031000 00:10:50.067 [2024-07-12 18:14:33.718460] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1031000 00:10:50.067 [2024-07-12 18:14:33.718615] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:50.067 BaseBdev2 00:10:50.067 18:14:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:50.067 18:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:10:50.067 18:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:50.067 18:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:10:50.067 18:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:50.067 18:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:10:50.067 18:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:50.326 18:14:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:50.585 [ 00:10:50.585 { 00:10:50.585 "name": "BaseBdev2", 00:10:50.585 "aliases": [ 00:10:50.585 "2c173341-b31d-4c14-b553-c26ce88daa40" 00:10:50.585 ], 00:10:50.585 "product_name": "Malloc disk", 00:10:50.585 "block_size": 512, 00:10:50.585 "num_blocks": 65536, 00:10:50.585 "uuid": "2c173341-b31d-4c14-b553-c26ce88daa40", 00:10:50.585 "assigned_rate_limits": { 00:10:50.585 "rw_ios_per_sec": 0, 00:10:50.585 "rw_mbytes_per_sec": 0, 00:10:50.585 "r_mbytes_per_sec": 0, 00:10:50.585 "w_mbytes_per_sec": 0 00:10:50.585 }, 00:10:50.585 "claimed": true, 00:10:50.585 "claim_type": "exclusive_write", 00:10:50.585 "zoned": false, 00:10:50.585 "supported_io_types": { 00:10:50.585 "read": true, 00:10:50.585 "write": true, 00:10:50.585 "unmap": true, 00:10:50.585 "flush": true, 00:10:50.585 "reset": true, 00:10:50.585 "nvme_admin": false, 00:10:50.585 "nvme_io": false, 00:10:50.585 "nvme_io_md": false, 00:10:50.585 "write_zeroes": true, 00:10:50.585 "zcopy": true, 00:10:50.585 "get_zone_info": false, 00:10:50.585 "zone_management": false, 00:10:50.585 "zone_append": false, 00:10:50.585 "compare": false, 00:10:50.585 "compare_and_write": false, 00:10:50.585 "abort": true, 00:10:50.585 "seek_hole": false, 00:10:50.585 "seek_data": false, 00:10:50.585 "copy": true, 00:10:50.585 "nvme_iov_md": false 00:10:50.585 }, 00:10:50.585 "memory_domains": [ 00:10:50.585 { 00:10:50.585 "dma_device_id": "system", 00:10:50.585 "dma_device_type": 1 00:10:50.585 }, 00:10:50.585 { 00:10:50.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.585 "dma_device_type": 2 
00:10:50.585 } 00:10:50.585 ], 00:10:50.585 "driver_specific": {} 00:10:50.585 } 00:10:50.585 ] 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:50.585 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:50.586 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.586 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:50.586 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:10:50.586 "name": "Existed_Raid", 00:10:50.586 "uuid": "000be946-f0bb-425f-b8bd-c1f21fabbdd1", 00:10:50.586 "strip_size_kb": 64, 00:10:50.586 "state": "online", 00:10:50.586 "raid_level": "raid0", 00:10:50.586 "superblock": false, 00:10:50.586 "num_base_bdevs": 2, 00:10:50.586 "num_base_bdevs_discovered": 2, 00:10:50.586 "num_base_bdevs_operational": 2, 00:10:50.586 "base_bdevs_list": [ 00:10:50.586 { 00:10:50.586 "name": "BaseBdev1", 00:10:50.586 "uuid": "e93965b0-409c-4b48-83b4-1cf905b7f071", 00:10:50.586 "is_configured": true, 00:10:50.586 "data_offset": 0, 00:10:50.586 "data_size": 65536 00:10:50.586 }, 00:10:50.586 { 00:10:50.586 "name": "BaseBdev2", 00:10:50.586 "uuid": "2c173341-b31d-4c14-b553-c26ce88daa40", 00:10:50.586 "is_configured": true, 00:10:50.586 "data_offset": 0, 00:10:50.586 "data_size": 65536 00:10:50.586 } 00:10:50.586 ] 00:10:50.586 }' 00:10:50.586 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:50.586 18:14:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.155 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:51.155 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:51.155 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:51.155 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:51.155 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:51.155 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:51.155 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:51.155 18:14:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:51.415 [2024-07-12 18:14:35.086034] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:51.415 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:51.415 "name": "Existed_Raid", 00:10:51.415 "aliases": [ 00:10:51.415 "000be946-f0bb-425f-b8bd-c1f21fabbdd1" 00:10:51.415 ], 00:10:51.415 "product_name": "Raid Volume", 00:10:51.415 "block_size": 512, 00:10:51.415 "num_blocks": 131072, 00:10:51.415 "uuid": "000be946-f0bb-425f-b8bd-c1f21fabbdd1", 00:10:51.415 "assigned_rate_limits": { 00:10:51.415 "rw_ios_per_sec": 0, 00:10:51.415 "rw_mbytes_per_sec": 0, 00:10:51.415 "r_mbytes_per_sec": 0, 00:10:51.415 "w_mbytes_per_sec": 0 00:10:51.415 }, 00:10:51.415 "claimed": false, 00:10:51.415 "zoned": false, 00:10:51.415 "supported_io_types": { 00:10:51.415 "read": true, 00:10:51.415 "write": true, 00:10:51.415 "unmap": true, 00:10:51.415 "flush": true, 00:10:51.415 "reset": true, 00:10:51.415 "nvme_admin": false, 00:10:51.415 "nvme_io": false, 00:10:51.415 "nvme_io_md": false, 00:10:51.415 "write_zeroes": true, 00:10:51.415 "zcopy": false, 00:10:51.415 "get_zone_info": false, 00:10:51.415 "zone_management": false, 00:10:51.415 "zone_append": false, 00:10:51.415 "compare": false, 00:10:51.415 "compare_and_write": false, 00:10:51.415 "abort": false, 00:10:51.415 "seek_hole": false, 00:10:51.415 "seek_data": false, 00:10:51.415 "copy": false, 00:10:51.415 "nvme_iov_md": false 00:10:51.415 }, 00:10:51.415 "memory_domains": [ 00:10:51.415 { 00:10:51.415 "dma_device_id": "system", 00:10:51.415 "dma_device_type": 1 00:10:51.415 }, 00:10:51.415 { 00:10:51.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:51.415 "dma_device_type": 2 00:10:51.415 }, 00:10:51.415 { 00:10:51.415 "dma_device_id": "system", 00:10:51.415 "dma_device_type": 1 00:10:51.415 }, 00:10:51.415 { 00:10:51.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:51.415 
"dma_device_type": 2 00:10:51.415 } 00:10:51.415 ], 00:10:51.415 "driver_specific": { 00:10:51.415 "raid": { 00:10:51.415 "uuid": "000be946-f0bb-425f-b8bd-c1f21fabbdd1", 00:10:51.415 "strip_size_kb": 64, 00:10:51.415 "state": "online", 00:10:51.415 "raid_level": "raid0", 00:10:51.415 "superblock": false, 00:10:51.415 "num_base_bdevs": 2, 00:10:51.415 "num_base_bdevs_discovered": 2, 00:10:51.415 "num_base_bdevs_operational": 2, 00:10:51.415 "base_bdevs_list": [ 00:10:51.415 { 00:10:51.415 "name": "BaseBdev1", 00:10:51.415 "uuid": "e93965b0-409c-4b48-83b4-1cf905b7f071", 00:10:51.415 "is_configured": true, 00:10:51.415 "data_offset": 0, 00:10:51.415 "data_size": 65536 00:10:51.415 }, 00:10:51.415 { 00:10:51.415 "name": "BaseBdev2", 00:10:51.415 "uuid": "2c173341-b31d-4c14-b553-c26ce88daa40", 00:10:51.415 "is_configured": true, 00:10:51.415 "data_offset": 0, 00:10:51.415 "data_size": 65536 00:10:51.415 } 00:10:51.415 ] 00:10:51.415 } 00:10:51.415 } 00:10:51.415 }' 00:10:51.415 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:51.415 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:51.415 BaseBdev2' 00:10:51.674 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:51.675 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:51.675 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:51.675 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:51.675 "name": "BaseBdev1", 00:10:51.675 "aliases": [ 00:10:51.675 "e93965b0-409c-4b48-83b4-1cf905b7f071" 00:10:51.675 ], 00:10:51.675 "product_name": "Malloc disk", 00:10:51.675 
"block_size": 512, 00:10:51.675 "num_blocks": 65536, 00:10:51.675 "uuid": "e93965b0-409c-4b48-83b4-1cf905b7f071", 00:10:51.675 "assigned_rate_limits": { 00:10:51.675 "rw_ios_per_sec": 0, 00:10:51.675 "rw_mbytes_per_sec": 0, 00:10:51.675 "r_mbytes_per_sec": 0, 00:10:51.675 "w_mbytes_per_sec": 0 00:10:51.675 }, 00:10:51.675 "claimed": true, 00:10:51.675 "claim_type": "exclusive_write", 00:10:51.675 "zoned": false, 00:10:51.675 "supported_io_types": { 00:10:51.675 "read": true, 00:10:51.675 "write": true, 00:10:51.675 "unmap": true, 00:10:51.675 "flush": true, 00:10:51.675 "reset": true, 00:10:51.675 "nvme_admin": false, 00:10:51.675 "nvme_io": false, 00:10:51.675 "nvme_io_md": false, 00:10:51.675 "write_zeroes": true, 00:10:51.675 "zcopy": true, 00:10:51.675 "get_zone_info": false, 00:10:51.675 "zone_management": false, 00:10:51.675 "zone_append": false, 00:10:51.675 "compare": false, 00:10:51.675 "compare_and_write": false, 00:10:51.675 "abort": true, 00:10:51.675 "seek_hole": false, 00:10:51.675 "seek_data": false, 00:10:51.675 "copy": true, 00:10:51.675 "nvme_iov_md": false 00:10:51.675 }, 00:10:51.675 "memory_domains": [ 00:10:51.675 { 00:10:51.675 "dma_device_id": "system", 00:10:51.675 "dma_device_type": 1 00:10:51.675 }, 00:10:51.675 { 00:10:51.675 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:51.675 "dma_device_type": 2 00:10:51.675 } 00:10:51.675 ], 00:10:51.675 "driver_specific": {} 00:10:51.675 }' 00:10:51.675 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:51.675 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:51.935 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:51.935 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:51.935 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:51.935 18:14:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:51.935 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:51.935 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:51.935 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:51.935 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:51.935 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:52.194 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:52.194 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:52.194 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:52.194 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:52.194 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:52.194 "name": "BaseBdev2", 00:10:52.194 "aliases": [ 00:10:52.194 "2c173341-b31d-4c14-b553-c26ce88daa40" 00:10:52.194 ], 00:10:52.194 "product_name": "Malloc disk", 00:10:52.194 "block_size": 512, 00:10:52.194 "num_blocks": 65536, 00:10:52.194 "uuid": "2c173341-b31d-4c14-b553-c26ce88daa40", 00:10:52.194 "assigned_rate_limits": { 00:10:52.194 "rw_ios_per_sec": 0, 00:10:52.194 "rw_mbytes_per_sec": 0, 00:10:52.194 "r_mbytes_per_sec": 0, 00:10:52.194 "w_mbytes_per_sec": 0 00:10:52.194 }, 00:10:52.194 "claimed": true, 00:10:52.194 "claim_type": "exclusive_write", 00:10:52.194 "zoned": false, 00:10:52.194 "supported_io_types": { 00:10:52.194 "read": true, 00:10:52.194 "write": true, 00:10:52.194 "unmap": true, 00:10:52.194 "flush": true, 00:10:52.194 "reset": true, 00:10:52.194 "nvme_admin": 
false, 00:10:52.194 "nvme_io": false, 00:10:52.194 "nvme_io_md": false, 00:10:52.194 "write_zeroes": true, 00:10:52.194 "zcopy": true, 00:10:52.194 "get_zone_info": false, 00:10:52.194 "zone_management": false, 00:10:52.194 "zone_append": false, 00:10:52.194 "compare": false, 00:10:52.194 "compare_and_write": false, 00:10:52.194 "abort": true, 00:10:52.194 "seek_hole": false, 00:10:52.194 "seek_data": false, 00:10:52.194 "copy": true, 00:10:52.194 "nvme_iov_md": false 00:10:52.194 }, 00:10:52.194 "memory_domains": [ 00:10:52.194 { 00:10:52.194 "dma_device_id": "system", 00:10:52.194 "dma_device_type": 1 00:10:52.194 }, 00:10:52.194 { 00:10:52.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:52.194 "dma_device_type": 2 00:10:52.194 } 00:10:52.194 ], 00:10:52.194 "driver_specific": {} 00:10:52.194 }' 00:10:52.194 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:52.453 18:14:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:52.453 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:52.453 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:52.453 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:52.453 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:52.453 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:52.453 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:52.712 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:52.712 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:52.712 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:52.712 18:14:36 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:52.712 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:52.971 [2024-07-12 18:14:36.489530] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:52.971 [2024-07-12 18:14:36.489557] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:52.971 [2024-07-12 18:14:36.489597] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:52.971 18:14:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.971 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:53.230 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:53.230 "name": "Existed_Raid", 00:10:53.230 "uuid": "000be946-f0bb-425f-b8bd-c1f21fabbdd1", 00:10:53.230 "strip_size_kb": 64, 00:10:53.230 "state": "offline", 00:10:53.230 "raid_level": "raid0", 00:10:53.230 "superblock": false, 00:10:53.230 "num_base_bdevs": 2, 00:10:53.230 "num_base_bdevs_discovered": 1, 00:10:53.230 "num_base_bdevs_operational": 1, 00:10:53.230 "base_bdevs_list": [ 00:10:53.230 { 00:10:53.230 "name": null, 00:10:53.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:53.230 "is_configured": false, 00:10:53.230 "data_offset": 0, 00:10:53.230 "data_size": 65536 00:10:53.230 }, 00:10:53.230 { 00:10:53.230 "name": "BaseBdev2", 00:10:53.230 "uuid": "2c173341-b31d-4c14-b553-c26ce88daa40", 00:10:53.230 "is_configured": true, 00:10:53.230 "data_offset": 0, 00:10:53.230 "data_size": 65536 00:10:53.230 } 00:10:53.230 ] 00:10:53.230 }' 00:10:53.230 18:14:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:53.230 18:14:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:54.170 18:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:54.170 18:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # 
(( i < num_base_bdevs )) 00:10:54.170 18:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.170 18:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:54.170 18:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:54.170 18:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:54.170 18:14:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:54.441 [2024-07-12 18:14:38.090781] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:54.441 [2024-07-12 18:14:38.090834] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1031000 name Existed_Raid, state offline 00:10:54.441 18:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:54.441 18:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:54.441 18:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.442 18:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2456021 00:10:54.730 18:14:38 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2456021 ']' 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2456021 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2456021 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2456021' 00:10:54.730 killing process with pid 2456021 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2456021 00:10:54.730 [2024-07-12 18:14:38.314566] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:54.730 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2456021 00:10:54.730 [2024-07-12 18:14:38.315433] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:54.989 00:10:54.989 real 0m10.380s 00:10:54.989 user 0m18.500s 00:10:54.989 sys 0m1.880s 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:54.989 ************************************ 00:10:54.989 END TEST raid_state_function_test 00:10:54.989 ************************************ 00:10:54.989 18:14:38 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:10:54.989 18:14:38 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:10:54.989 18:14:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:54.989 18:14:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:54.989 18:14:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:54.989 ************************************ 00:10:54.989 START TEST raid_state_function_test_sb 00:10:54.989 ************************************ 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:54.989 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2457649 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2457649' 00:10:54.990 Process raid pid: 2457649 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2457649 /var/tmp/spdk-raid.sock 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 
-- # '[' -z 2457649 ']' 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:54.990 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:54.990 18:14:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:54.990 [2024-07-12 18:14:38.648560] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:10:54.990 [2024-07-12 18:14:38.648626] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:55.249 [2024-07-12 18:14:38.768865] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:55.249 [2024-07-12 18:14:38.875688] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:55.249 [2024-07-12 18:14:38.943197] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:55.249 [2024-07-12 18:14:38.943235] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:56.186 18:14:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:56.187 [2024-07-12 18:14:39.785511] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:56.187 [2024-07-12 18:14:39.785554] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:56.187 [2024-07-12 18:14:39.785565] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:56.187 [2024-07-12 18:14:39.785577] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.187 18:14:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:56.755 18:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:56.755 "name": "Existed_Raid", 00:10:56.755 "uuid": "391111fa-c945-490e-9d6d-51dbab5fa8e7", 00:10:56.755 "strip_size_kb": 64, 00:10:56.755 "state": "configuring", 00:10:56.755 "raid_level": "raid0", 00:10:56.755 "superblock": true, 00:10:56.755 "num_base_bdevs": 2, 00:10:56.755 "num_base_bdevs_discovered": 0, 00:10:56.755 "num_base_bdevs_operational": 2, 00:10:56.755 "base_bdevs_list": [ 00:10:56.755 { 00:10:56.755 "name": "BaseBdev1", 00:10:56.755 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:56.756 "is_configured": false, 00:10:56.756 "data_offset": 0, 00:10:56.756 "data_size": 0 00:10:56.756 }, 00:10:56.756 { 00:10:56.756 "name": "BaseBdev2", 00:10:56.756 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:56.756 "is_configured": false, 00:10:56.756 "data_offset": 0, 00:10:56.756 "data_size": 0 00:10:56.756 } 00:10:56.756 ] 00:10:56.756 }' 00:10:56.756 18:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:56.756 18:14:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:57.324 18:14:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:57.324 [2024-07-12 18:14:41.044699] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:57.324 [2024-07-12 18:14:41.044733] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2312a80 name Existed_Raid, state configuring 00:10:57.583 18:14:41 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:57.583 [2024-07-12 18:14:41.289382] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:57.583 [2024-07-12 18:14:41.289414] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:57.583 [2024-07-12 18:14:41.289424] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:57.583 [2024-07-12 18:14:41.289435] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:57.583 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:57.842 [2024-07-12 18:14:41.467703] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:57.842 BaseBdev1 00:10:57.842 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:57.842 18:14:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:10:57.842 18:14:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:57.842 18:14:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:10:57.842 18:14:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:57.842 18:14:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:57.842 18:14:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:58.101 
18:14:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:58.360 [ 00:10:58.360 { 00:10:58.360 "name": "BaseBdev1", 00:10:58.360 "aliases": [ 00:10:58.360 "a208d73e-f889-46cc-9307-0370d932eacf" 00:10:58.360 ], 00:10:58.360 "product_name": "Malloc disk", 00:10:58.360 "block_size": 512, 00:10:58.360 "num_blocks": 65536, 00:10:58.360 "uuid": "a208d73e-f889-46cc-9307-0370d932eacf", 00:10:58.360 "assigned_rate_limits": { 00:10:58.360 "rw_ios_per_sec": 0, 00:10:58.360 "rw_mbytes_per_sec": 0, 00:10:58.360 "r_mbytes_per_sec": 0, 00:10:58.360 "w_mbytes_per_sec": 0 00:10:58.360 }, 00:10:58.360 "claimed": true, 00:10:58.360 "claim_type": "exclusive_write", 00:10:58.360 "zoned": false, 00:10:58.360 "supported_io_types": { 00:10:58.360 "read": true, 00:10:58.360 "write": true, 00:10:58.360 "unmap": true, 00:10:58.360 "flush": true, 00:10:58.360 "reset": true, 00:10:58.360 "nvme_admin": false, 00:10:58.360 "nvme_io": false, 00:10:58.360 "nvme_io_md": false, 00:10:58.360 "write_zeroes": true, 00:10:58.360 "zcopy": true, 00:10:58.360 "get_zone_info": false, 00:10:58.360 "zone_management": false, 00:10:58.360 "zone_append": false, 00:10:58.360 "compare": false, 00:10:58.360 "compare_and_write": false, 00:10:58.360 "abort": true, 00:10:58.360 "seek_hole": false, 00:10:58.360 "seek_data": false, 00:10:58.360 "copy": true, 00:10:58.360 "nvme_iov_md": false 00:10:58.360 }, 00:10:58.360 "memory_domains": [ 00:10:58.360 { 00:10:58.360 "dma_device_id": "system", 00:10:58.360 "dma_device_type": 1 00:10:58.360 }, 00:10:58.360 { 00:10:58.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:58.360 "dma_device_type": 2 00:10:58.360 } 00:10:58.360 ], 00:10:58.360 "driver_specific": {} 00:10:58.360 } 00:10:58.360 ] 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:10:58.360 
18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.360 18:14:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:58.619 18:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:58.619 "name": "Existed_Raid", 00:10:58.619 "uuid": "ef24c622-e252-4fd1-b345-c72454d05890", 00:10:58.619 "strip_size_kb": 64, 00:10:58.619 "state": "configuring", 00:10:58.619 "raid_level": "raid0", 00:10:58.619 "superblock": true, 00:10:58.619 "num_base_bdevs": 2, 00:10:58.619 "num_base_bdevs_discovered": 1, 00:10:58.619 "num_base_bdevs_operational": 2, 00:10:58.619 
"base_bdevs_list": [ 00:10:58.619 { 00:10:58.619 "name": "BaseBdev1", 00:10:58.619 "uuid": "a208d73e-f889-46cc-9307-0370d932eacf", 00:10:58.619 "is_configured": true, 00:10:58.619 "data_offset": 2048, 00:10:58.619 "data_size": 63488 00:10:58.619 }, 00:10:58.619 { 00:10:58.619 "name": "BaseBdev2", 00:10:58.619 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:58.619 "is_configured": false, 00:10:58.619 "data_offset": 0, 00:10:58.619 "data_size": 0 00:10:58.619 } 00:10:58.619 ] 00:10:58.619 }' 00:10:58.619 18:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:58.619 18:14:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:59.186 18:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:59.443 [2024-07-12 18:14:42.927598] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:59.443 [2024-07-12 18:14:42.927633] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2312350 name Existed_Raid, state configuring 00:10:59.443 18:14:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:59.443 [2024-07-12 18:14:43.108124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:59.443 [2024-07-12 18:14:43.109698] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:59.443 [2024-07-12 18:14:43.109729] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.443 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:59.701 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:59.701 "name": "Existed_Raid", 00:10:59.701 "uuid": "53745ddb-98ad-4f9d-8573-bea30fe49f49", 00:10:59.701 "strip_size_kb": 64, 00:10:59.701 "state": "configuring", 00:10:59.701 "raid_level": "raid0", 00:10:59.701 "superblock": true, 00:10:59.701 "num_base_bdevs": 2, 00:10:59.701 
"num_base_bdevs_discovered": 1, 00:10:59.701 "num_base_bdevs_operational": 2, 00:10:59.701 "base_bdevs_list": [ 00:10:59.701 { 00:10:59.701 "name": "BaseBdev1", 00:10:59.701 "uuid": "a208d73e-f889-46cc-9307-0370d932eacf", 00:10:59.701 "is_configured": true, 00:10:59.701 "data_offset": 2048, 00:10:59.701 "data_size": 63488 00:10:59.701 }, 00:10:59.701 { 00:10:59.701 "name": "BaseBdev2", 00:10:59.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:59.701 "is_configured": false, 00:10:59.701 "data_offset": 0, 00:10:59.701 "data_size": 0 00:10:59.701 } 00:10:59.701 ] 00:10:59.701 }' 00:10:59.701 18:14:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:59.701 18:14:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:00.636 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:00.895 [2024-07-12 18:14:44.378900] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:00.895 [2024-07-12 18:14:44.379057] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2313000 00:11:00.895 [2024-07-12 18:14:44.379071] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:00.895 [2024-07-12 18:14:44.379242] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x222d0c0 00:11:00.895 [2024-07-12 18:14:44.379353] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2313000 00:11:00.895 [2024-07-12 18:14:44.379363] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2313000 00:11:00.895 [2024-07-12 18:14:44.379454] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:00.895 BaseBdev2 00:11:00.895 18:14:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:00.895 18:14:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:00.895 18:14:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:00.895 18:14:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:00.895 18:14:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:00.895 18:14:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:00.895 18:14:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:01.154 18:14:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:01.154 [ 00:11:01.154 { 00:11:01.154 "name": "BaseBdev2", 00:11:01.154 "aliases": [ 00:11:01.154 "20b9812f-358c-4efc-ac5a-504479b21c95" 00:11:01.154 ], 00:11:01.154 "product_name": "Malloc disk", 00:11:01.154 "block_size": 512, 00:11:01.154 "num_blocks": 65536, 00:11:01.154 "uuid": "20b9812f-358c-4efc-ac5a-504479b21c95", 00:11:01.154 "assigned_rate_limits": { 00:11:01.154 "rw_ios_per_sec": 0, 00:11:01.154 "rw_mbytes_per_sec": 0, 00:11:01.154 "r_mbytes_per_sec": 0, 00:11:01.154 "w_mbytes_per_sec": 0 00:11:01.154 }, 00:11:01.154 "claimed": true, 00:11:01.154 "claim_type": "exclusive_write", 00:11:01.154 "zoned": false, 00:11:01.154 "supported_io_types": { 00:11:01.154 "read": true, 00:11:01.154 "write": true, 00:11:01.154 "unmap": true, 00:11:01.154 "flush": true, 00:11:01.154 "reset": true, 00:11:01.154 "nvme_admin": false, 00:11:01.154 "nvme_io": false, 00:11:01.154 "nvme_io_md": false, 00:11:01.154 "write_zeroes": true, 
00:11:01.154 "zcopy": true, 00:11:01.154 "get_zone_info": false, 00:11:01.154 "zone_management": false, 00:11:01.154 "zone_append": false, 00:11:01.154 "compare": false, 00:11:01.154 "compare_and_write": false, 00:11:01.154 "abort": true, 00:11:01.154 "seek_hole": false, 00:11:01.154 "seek_data": false, 00:11:01.154 "copy": true, 00:11:01.154 "nvme_iov_md": false 00:11:01.154 }, 00:11:01.154 "memory_domains": [ 00:11:01.154 { 00:11:01.154 "dma_device_id": "system", 00:11:01.154 "dma_device_type": 1 00:11:01.154 }, 00:11:01.154 { 00:11:01.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:01.154 "dma_device_type": 2 00:11:01.154 } 00:11:01.154 ], 00:11:01.154 "driver_specific": {} 00:11:01.154 } 00:11:01.154 ] 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.414 18:14:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:01.414 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:01.414 "name": "Existed_Raid", 00:11:01.414 "uuid": "53745ddb-98ad-4f9d-8573-bea30fe49f49", 00:11:01.414 "strip_size_kb": 64, 00:11:01.414 "state": "online", 00:11:01.414 "raid_level": "raid0", 00:11:01.414 "superblock": true, 00:11:01.414 "num_base_bdevs": 2, 00:11:01.414 "num_base_bdevs_discovered": 2, 00:11:01.414 "num_base_bdevs_operational": 2, 00:11:01.414 "base_bdevs_list": [ 00:11:01.414 { 00:11:01.414 "name": "BaseBdev1", 00:11:01.414 "uuid": "a208d73e-f889-46cc-9307-0370d932eacf", 00:11:01.414 "is_configured": true, 00:11:01.414 "data_offset": 2048, 00:11:01.414 "data_size": 63488 00:11:01.414 }, 00:11:01.414 { 00:11:01.414 "name": "BaseBdev2", 00:11:01.414 "uuid": "20b9812f-358c-4efc-ac5a-504479b21c95", 00:11:01.414 "is_configured": true, 00:11:01.414 "data_offset": 2048, 00:11:01.414 "data_size": 63488 00:11:01.414 } 00:11:01.414 ] 00:11:01.414 }' 00:11:01.414 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:01.414 18:14:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:02.347 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:02.347 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:11:02.348 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:02.348 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:02.348 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:02.348 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:02.348 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:02.348 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:02.348 [2024-07-12 18:14:45.963400] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:02.348 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:02.348 "name": "Existed_Raid", 00:11:02.348 "aliases": [ 00:11:02.348 "53745ddb-98ad-4f9d-8573-bea30fe49f49" 00:11:02.348 ], 00:11:02.348 "product_name": "Raid Volume", 00:11:02.348 "block_size": 512, 00:11:02.348 "num_blocks": 126976, 00:11:02.348 "uuid": "53745ddb-98ad-4f9d-8573-bea30fe49f49", 00:11:02.348 "assigned_rate_limits": { 00:11:02.348 "rw_ios_per_sec": 0, 00:11:02.348 "rw_mbytes_per_sec": 0, 00:11:02.348 "r_mbytes_per_sec": 0, 00:11:02.348 "w_mbytes_per_sec": 0 00:11:02.348 }, 00:11:02.348 "claimed": false, 00:11:02.348 "zoned": false, 00:11:02.348 "supported_io_types": { 00:11:02.348 "read": true, 00:11:02.348 "write": true, 00:11:02.348 "unmap": true, 00:11:02.348 "flush": true, 00:11:02.348 "reset": true, 00:11:02.348 "nvme_admin": false, 00:11:02.348 "nvme_io": false, 00:11:02.348 "nvme_io_md": false, 00:11:02.348 "write_zeroes": true, 00:11:02.348 "zcopy": false, 00:11:02.348 "get_zone_info": false, 00:11:02.348 "zone_management": false, 00:11:02.348 
"zone_append": false, 00:11:02.348 "compare": false, 00:11:02.348 "compare_and_write": false, 00:11:02.348 "abort": false, 00:11:02.348 "seek_hole": false, 00:11:02.348 "seek_data": false, 00:11:02.348 "copy": false, 00:11:02.348 "nvme_iov_md": false 00:11:02.348 }, 00:11:02.348 "memory_domains": [ 00:11:02.348 { 00:11:02.348 "dma_device_id": "system", 00:11:02.348 "dma_device_type": 1 00:11:02.348 }, 00:11:02.348 { 00:11:02.348 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:02.348 "dma_device_type": 2 00:11:02.348 }, 00:11:02.348 { 00:11:02.348 "dma_device_id": "system", 00:11:02.348 "dma_device_type": 1 00:11:02.348 }, 00:11:02.348 { 00:11:02.348 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:02.348 "dma_device_type": 2 00:11:02.348 } 00:11:02.348 ], 00:11:02.348 "driver_specific": { 00:11:02.348 "raid": { 00:11:02.348 "uuid": "53745ddb-98ad-4f9d-8573-bea30fe49f49", 00:11:02.348 "strip_size_kb": 64, 00:11:02.348 "state": "online", 00:11:02.348 "raid_level": "raid0", 00:11:02.348 "superblock": true, 00:11:02.348 "num_base_bdevs": 2, 00:11:02.348 "num_base_bdevs_discovered": 2, 00:11:02.348 "num_base_bdevs_operational": 2, 00:11:02.348 "base_bdevs_list": [ 00:11:02.348 { 00:11:02.348 "name": "BaseBdev1", 00:11:02.348 "uuid": "a208d73e-f889-46cc-9307-0370d932eacf", 00:11:02.348 "is_configured": true, 00:11:02.348 "data_offset": 2048, 00:11:02.348 "data_size": 63488 00:11:02.348 }, 00:11:02.348 { 00:11:02.348 "name": "BaseBdev2", 00:11:02.348 "uuid": "20b9812f-358c-4efc-ac5a-504479b21c95", 00:11:02.348 "is_configured": true, 00:11:02.348 "data_offset": 2048, 00:11:02.348 "data_size": 63488 00:11:02.348 } 00:11:02.348 ] 00:11:02.348 } 00:11:02.348 } 00:11:02.348 }' 00:11:02.348 18:14:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:02.348 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:02.348 
BaseBdev2' 00:11:02.348 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:02.348 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:02.348 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:02.606 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:02.606 "name": "BaseBdev1", 00:11:02.606 "aliases": [ 00:11:02.606 "a208d73e-f889-46cc-9307-0370d932eacf" 00:11:02.606 ], 00:11:02.606 "product_name": "Malloc disk", 00:11:02.606 "block_size": 512, 00:11:02.606 "num_blocks": 65536, 00:11:02.606 "uuid": "a208d73e-f889-46cc-9307-0370d932eacf", 00:11:02.606 "assigned_rate_limits": { 00:11:02.606 "rw_ios_per_sec": 0, 00:11:02.606 "rw_mbytes_per_sec": 0, 00:11:02.606 "r_mbytes_per_sec": 0, 00:11:02.606 "w_mbytes_per_sec": 0 00:11:02.606 }, 00:11:02.606 "claimed": true, 00:11:02.606 "claim_type": "exclusive_write", 00:11:02.606 "zoned": false, 00:11:02.606 "supported_io_types": { 00:11:02.606 "read": true, 00:11:02.606 "write": true, 00:11:02.606 "unmap": true, 00:11:02.606 "flush": true, 00:11:02.606 "reset": true, 00:11:02.606 "nvme_admin": false, 00:11:02.606 "nvme_io": false, 00:11:02.606 "nvme_io_md": false, 00:11:02.606 "write_zeroes": true, 00:11:02.606 "zcopy": true, 00:11:02.606 "get_zone_info": false, 00:11:02.606 "zone_management": false, 00:11:02.606 "zone_append": false, 00:11:02.606 "compare": false, 00:11:02.606 "compare_and_write": false, 00:11:02.606 "abort": true, 00:11:02.606 "seek_hole": false, 00:11:02.606 "seek_data": false, 00:11:02.606 "copy": true, 00:11:02.606 "nvme_iov_md": false 00:11:02.606 }, 00:11:02.606 "memory_domains": [ 00:11:02.606 { 00:11:02.606 "dma_device_id": "system", 00:11:02.606 "dma_device_type": 1 00:11:02.606 }, 00:11:02.606 { 
00:11:02.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:02.606 "dma_device_type": 2 00:11:02.606 } 00:11:02.606 ], 00:11:02.606 "driver_specific": {} 00:11:02.606 }' 00:11:02.606 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:02.606 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:02.864 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:02.864 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:02.864 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:02.864 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:02.864 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:02.864 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:02.864 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:02.864 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:02.864 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:03.122 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:03.122 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:03.122 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:03.122 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:03.381 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:03.381 "name": 
"BaseBdev2", 00:11:03.381 "aliases": [ 00:11:03.381 "20b9812f-358c-4efc-ac5a-504479b21c95" 00:11:03.381 ], 00:11:03.381 "product_name": "Malloc disk", 00:11:03.381 "block_size": 512, 00:11:03.381 "num_blocks": 65536, 00:11:03.381 "uuid": "20b9812f-358c-4efc-ac5a-504479b21c95", 00:11:03.381 "assigned_rate_limits": { 00:11:03.381 "rw_ios_per_sec": 0, 00:11:03.381 "rw_mbytes_per_sec": 0, 00:11:03.381 "r_mbytes_per_sec": 0, 00:11:03.381 "w_mbytes_per_sec": 0 00:11:03.381 }, 00:11:03.381 "claimed": true, 00:11:03.381 "claim_type": "exclusive_write", 00:11:03.381 "zoned": false, 00:11:03.381 "supported_io_types": { 00:11:03.381 "read": true, 00:11:03.381 "write": true, 00:11:03.381 "unmap": true, 00:11:03.381 "flush": true, 00:11:03.381 "reset": true, 00:11:03.381 "nvme_admin": false, 00:11:03.381 "nvme_io": false, 00:11:03.381 "nvme_io_md": false, 00:11:03.381 "write_zeroes": true, 00:11:03.381 "zcopy": true, 00:11:03.381 "get_zone_info": false, 00:11:03.381 "zone_management": false, 00:11:03.381 "zone_append": false, 00:11:03.381 "compare": false, 00:11:03.381 "compare_and_write": false, 00:11:03.381 "abort": true, 00:11:03.381 "seek_hole": false, 00:11:03.381 "seek_data": false, 00:11:03.381 "copy": true, 00:11:03.381 "nvme_iov_md": false 00:11:03.381 }, 00:11:03.381 "memory_domains": [ 00:11:03.381 { 00:11:03.381 "dma_device_id": "system", 00:11:03.381 "dma_device_type": 1 00:11:03.381 }, 00:11:03.381 { 00:11:03.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:03.381 "dma_device_type": 2 00:11:03.381 } 00:11:03.381 ], 00:11:03.381 "driver_specific": {} 00:11:03.381 }' 00:11:03.381 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:03.381 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:03.381 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:03.381 18:14:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:11:03.381 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:03.381 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:03.381 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:03.381 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:03.640 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:03.640 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:03.640 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:03.640 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:03.640 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:03.898 [2024-07-12 18:14:47.447307] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:03.898 [2024-07-12 18:14:47.447332] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:03.899 [2024-07-12 18:14:47.447372] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:03.899 18:14:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.899 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:04.157 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:04.157 "name": "Existed_Raid", 00:11:04.157 "uuid": "53745ddb-98ad-4f9d-8573-bea30fe49f49", 00:11:04.157 "strip_size_kb": 64, 00:11:04.157 "state": "offline", 00:11:04.157 "raid_level": "raid0", 00:11:04.157 "superblock": true, 00:11:04.157 "num_base_bdevs": 2, 00:11:04.157 "num_base_bdevs_discovered": 1, 00:11:04.157 "num_base_bdevs_operational": 1, 00:11:04.157 "base_bdevs_list": [ 
00:11:04.157 { 00:11:04.157 "name": null, 00:11:04.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:04.157 "is_configured": false, 00:11:04.157 "data_offset": 2048, 00:11:04.157 "data_size": 63488 00:11:04.157 }, 00:11:04.157 { 00:11:04.157 "name": "BaseBdev2", 00:11:04.157 "uuid": "20b9812f-358c-4efc-ac5a-504479b21c95", 00:11:04.157 "is_configured": true, 00:11:04.157 "data_offset": 2048, 00:11:04.157 "data_size": 63488 00:11:04.157 } 00:11:04.157 ] 00:11:04.157 }' 00:11:04.157 18:14:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:04.157 18:14:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:04.725 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:04.725 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:04.725 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.725 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:04.984 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:04.984 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:04.984 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:04.984 [2024-07-12 18:14:48.695607] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:04.984 [2024-07-12 18:14:48.695652] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2313000 name Existed_Raid, state offline 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2457649 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2457649 ']' 00:11:05.243 18:14:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2457649 00:11:05.502 18:14:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:05.502 18:14:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:05.502 18:14:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2457649 00:11:05.502 18:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:05.502 18:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:05.502 18:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2457649' 00:11:05.502 killing process with pid 2457649 00:11:05.502 18:14:49 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 2457649 00:11:05.502 [2024-07-12 18:14:49.010659] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:05.502 18:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2457649 00:11:05.502 [2024-07-12 18:14:49.011526] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:05.502 18:14:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:05.502 00:11:05.502 real 0m10.643s 00:11:05.502 user 0m18.948s 00:11:05.502 sys 0m1.974s 00:11:05.502 18:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:05.502 18:14:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:05.502 ************************************ 00:11:05.502 END TEST raid_state_function_test_sb 00:11:05.502 ************************************ 00:11:05.762 18:14:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:05.762 18:14:49 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:05.762 18:14:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:05.762 18:14:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:05.762 18:14:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:05.762 ************************************ 00:11:05.762 START TEST raid_superblock_test 00:11:05.762 ************************************ 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:05.762 18:14:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2459278 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2459278 /var/tmp/spdk-raid.sock 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2459278 ']' 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:05.762 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:05.762 18:14:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:05.762 [2024-07-12 18:14:49.357330] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:11:05.762 [2024-07-12 18:14:49.357384] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2459278 ] 00:11:05.762 [2024-07-12 18:14:49.470609] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:06.021 [2024-07-12 18:14:49.574713] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:06.021 [2024-07-12 18:14:49.635656] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:06.021 [2024-07-12 18:14:49.635695] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc1 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:06.588 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:06.848 malloc1 00:11:06.848 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:07.107 [2024-07-12 18:14:50.647335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:07.107 [2024-07-12 18:14:50.647387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:07.107 [2024-07-12 18:14:50.647406] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0b570 00:11:07.107 [2024-07-12 18:14:50.647418] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:07.107 [2024-07-12 18:14:50.649003] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:07.107 [2024-07-12 18:14:50.649031] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:07.107 pt1 00:11:07.107 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:07.107 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:11:07.107 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:07.107 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:07.107 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:07.107 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:07.107 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:07.107 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:07.107 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:07.366 malloc2 00:11:07.366 18:14:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:07.366 [2024-07-12 18:14:51.077287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:07.366 [2024-07-12 18:14:51.077333] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:07.366 [2024-07-12 18:14:51.077349] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0c970 00:11:07.366 [2024-07-12 18:14:51.077361] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:07.366 [2024-07-12 18:14:51.078860] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:07.366 [2024-07-12 18:14:51.078888] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:07.366 pt2 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:07.625 [2024-07-12 18:14:51.325971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:07.625 [2024-07-12 18:14:51.327141] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:07.625 [2024-07-12 18:14:51.327274] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1faf270 00:11:07.625 [2024-07-12 18:14:51.327287] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:07.625 [2024-07-12 18:14:51.327466] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa4c10 00:11:07.625 [2024-07-12 18:14:51.327602] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1faf270 00:11:07.625 [2024-07-12 18:14:51.327612] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1faf270 00:11:07.625 [2024-07-12 18:14:51.327701] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.625 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:07.884 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:07.884 "name": "raid_bdev1", 00:11:07.884 "uuid": "a975c54c-dd04-4d49-8237-965a1710f76a", 00:11:07.884 "strip_size_kb": 64, 00:11:07.884 "state": "online", 00:11:07.884 "raid_level": "raid0", 00:11:07.884 "superblock": true, 00:11:07.884 "num_base_bdevs": 2, 00:11:07.884 "num_base_bdevs_discovered": 2, 00:11:07.884 "num_base_bdevs_operational": 2, 00:11:07.884 "base_bdevs_list": [ 00:11:07.884 { 00:11:07.884 "name": "pt1", 00:11:07.884 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:07.884 "is_configured": true, 00:11:07.884 "data_offset": 2048, 00:11:07.884 "data_size": 63488 00:11:07.884 }, 00:11:07.884 { 00:11:07.884 "name": "pt2", 00:11:07.884 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:07.884 "is_configured": true, 00:11:07.884 "data_offset": 2048, 00:11:07.884 "data_size": 63488 00:11:07.884 } 00:11:07.884 ] 00:11:07.884 }' 00:11:07.884 18:14:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:07.884 18:14:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.821 18:14:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:08.821 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:08.821 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:08.821 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:08.821 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:08.821 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:08.821 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:08.821 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:08.821 [2024-07-12 18:14:52.409051] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:08.821 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:08.821 "name": "raid_bdev1", 00:11:08.821 "aliases": [ 00:11:08.821 "a975c54c-dd04-4d49-8237-965a1710f76a" 00:11:08.821 ], 00:11:08.821 "product_name": "Raid Volume", 00:11:08.821 "block_size": 512, 00:11:08.821 "num_blocks": 126976, 00:11:08.821 "uuid": "a975c54c-dd04-4d49-8237-965a1710f76a", 00:11:08.821 "assigned_rate_limits": { 00:11:08.821 "rw_ios_per_sec": 0, 00:11:08.821 "rw_mbytes_per_sec": 0, 00:11:08.821 "r_mbytes_per_sec": 0, 00:11:08.821 "w_mbytes_per_sec": 0 00:11:08.821 }, 00:11:08.821 "claimed": false, 00:11:08.821 "zoned": false, 00:11:08.821 "supported_io_types": { 00:11:08.821 "read": true, 00:11:08.821 "write": true, 00:11:08.821 "unmap": true, 00:11:08.821 "flush": true, 00:11:08.821 "reset": true, 00:11:08.821 "nvme_admin": false, 00:11:08.821 "nvme_io": false, 00:11:08.821 "nvme_io_md": false, 00:11:08.821 "write_zeroes": 
true, 00:11:08.821 "zcopy": false, 00:11:08.821 "get_zone_info": false, 00:11:08.821 "zone_management": false, 00:11:08.821 "zone_append": false, 00:11:08.821 "compare": false, 00:11:08.821 "compare_and_write": false, 00:11:08.821 "abort": false, 00:11:08.821 "seek_hole": false, 00:11:08.822 "seek_data": false, 00:11:08.822 "copy": false, 00:11:08.822 "nvme_iov_md": false 00:11:08.822 }, 00:11:08.822 "memory_domains": [ 00:11:08.822 { 00:11:08.822 "dma_device_id": "system", 00:11:08.822 "dma_device_type": 1 00:11:08.822 }, 00:11:08.822 { 00:11:08.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.822 "dma_device_type": 2 00:11:08.822 }, 00:11:08.822 { 00:11:08.822 "dma_device_id": "system", 00:11:08.822 "dma_device_type": 1 00:11:08.822 }, 00:11:08.822 { 00:11:08.822 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:08.822 "dma_device_type": 2 00:11:08.822 } 00:11:08.822 ], 00:11:08.822 "driver_specific": { 00:11:08.822 "raid": { 00:11:08.822 "uuid": "a975c54c-dd04-4d49-8237-965a1710f76a", 00:11:08.822 "strip_size_kb": 64, 00:11:08.822 "state": "online", 00:11:08.822 "raid_level": "raid0", 00:11:08.822 "superblock": true, 00:11:08.822 "num_base_bdevs": 2, 00:11:08.822 "num_base_bdevs_discovered": 2, 00:11:08.822 "num_base_bdevs_operational": 2, 00:11:08.822 "base_bdevs_list": [ 00:11:08.822 { 00:11:08.822 "name": "pt1", 00:11:08.822 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:08.822 "is_configured": true, 00:11:08.822 "data_offset": 2048, 00:11:08.822 "data_size": 63488 00:11:08.822 }, 00:11:08.822 { 00:11:08.822 "name": "pt2", 00:11:08.822 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:08.822 "is_configured": true, 00:11:08.822 "data_offset": 2048, 00:11:08.822 "data_size": 63488 00:11:08.822 } 00:11:08.822 ] 00:11:08.822 } 00:11:08.822 } 00:11:08.822 }' 00:11:08.822 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:08.822 18:14:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:08.822 pt2' 00:11:08.822 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:08.822 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:08.822 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:09.083 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:09.083 "name": "pt1", 00:11:09.083 "aliases": [ 00:11:09.083 "00000000-0000-0000-0000-000000000001" 00:11:09.083 ], 00:11:09.083 "product_name": "passthru", 00:11:09.083 "block_size": 512, 00:11:09.083 "num_blocks": 65536, 00:11:09.083 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:09.083 "assigned_rate_limits": { 00:11:09.083 "rw_ios_per_sec": 0, 00:11:09.083 "rw_mbytes_per_sec": 0, 00:11:09.083 "r_mbytes_per_sec": 0, 00:11:09.083 "w_mbytes_per_sec": 0 00:11:09.083 }, 00:11:09.083 "claimed": true, 00:11:09.083 "claim_type": "exclusive_write", 00:11:09.083 "zoned": false, 00:11:09.083 "supported_io_types": { 00:11:09.083 "read": true, 00:11:09.083 "write": true, 00:11:09.083 "unmap": true, 00:11:09.083 "flush": true, 00:11:09.083 "reset": true, 00:11:09.083 "nvme_admin": false, 00:11:09.083 "nvme_io": false, 00:11:09.083 "nvme_io_md": false, 00:11:09.083 "write_zeroes": true, 00:11:09.083 "zcopy": true, 00:11:09.083 "get_zone_info": false, 00:11:09.083 "zone_management": false, 00:11:09.083 "zone_append": false, 00:11:09.083 "compare": false, 00:11:09.083 "compare_and_write": false, 00:11:09.083 "abort": true, 00:11:09.083 "seek_hole": false, 00:11:09.083 "seek_data": false, 00:11:09.083 "copy": true, 00:11:09.083 "nvme_iov_md": false 00:11:09.084 }, 00:11:09.084 "memory_domains": [ 00:11:09.084 { 00:11:09.084 "dma_device_id": "system", 00:11:09.084 
"dma_device_type": 1 00:11:09.084 }, 00:11:09.084 { 00:11:09.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:09.084 "dma_device_type": 2 00:11:09.084 } 00:11:09.084 ], 00:11:09.084 "driver_specific": { 00:11:09.084 "passthru": { 00:11:09.084 "name": "pt1", 00:11:09.084 "base_bdev_name": "malloc1" 00:11:09.084 } 00:11:09.084 } 00:11:09.084 }' 00:11:09.084 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:09.084 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:09.084 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:09.084 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:09.341 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:09.341 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:09.341 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:09.341 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:09.341 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:09.341 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:09.341 18:14:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:09.341 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:09.341 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:09.341 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:09.341 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:09.598 18:14:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:09.598 "name": "pt2", 00:11:09.598 "aliases": [ 00:11:09.598 "00000000-0000-0000-0000-000000000002" 00:11:09.598 ], 00:11:09.598 "product_name": "passthru", 00:11:09.598 "block_size": 512, 00:11:09.598 "num_blocks": 65536, 00:11:09.598 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:09.598 "assigned_rate_limits": { 00:11:09.598 "rw_ios_per_sec": 0, 00:11:09.598 "rw_mbytes_per_sec": 0, 00:11:09.598 "r_mbytes_per_sec": 0, 00:11:09.598 "w_mbytes_per_sec": 0 00:11:09.598 }, 00:11:09.598 "claimed": true, 00:11:09.598 "claim_type": "exclusive_write", 00:11:09.598 "zoned": false, 00:11:09.598 "supported_io_types": { 00:11:09.598 "read": true, 00:11:09.598 "write": true, 00:11:09.598 "unmap": true, 00:11:09.598 "flush": true, 00:11:09.598 "reset": true, 00:11:09.598 "nvme_admin": false, 00:11:09.598 "nvme_io": false, 00:11:09.598 "nvme_io_md": false, 00:11:09.598 "write_zeroes": true, 00:11:09.598 "zcopy": true, 00:11:09.598 "get_zone_info": false, 00:11:09.598 "zone_management": false, 00:11:09.598 "zone_append": false, 00:11:09.598 "compare": false, 00:11:09.598 "compare_and_write": false, 00:11:09.598 "abort": true, 00:11:09.598 "seek_hole": false, 00:11:09.598 "seek_data": false, 00:11:09.598 "copy": true, 00:11:09.598 "nvme_iov_md": false 00:11:09.598 }, 00:11:09.598 "memory_domains": [ 00:11:09.598 { 00:11:09.598 "dma_device_id": "system", 00:11:09.598 "dma_device_type": 1 00:11:09.598 }, 00:11:09.598 { 00:11:09.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:09.598 "dma_device_type": 2 00:11:09.598 } 00:11:09.598 ], 00:11:09.598 "driver_specific": { 00:11:09.598 "passthru": { 00:11:09.598 "name": "pt2", 00:11:09.598 "base_bdev_name": "malloc2" 00:11:09.598 } 00:11:09.598 } 00:11:09.598 }' 00:11:09.598 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:09.598 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:09.855 18:14:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:09.855 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:09.855 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:09.855 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:09.855 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:09.855 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:09.855 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:09.855 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:09.855 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:10.141 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:10.141 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:10.141 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:10.141 [2024-07-12 18:14:53.828811] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:10.414 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a975c54c-dd04-4d49-8237-965a1710f76a 00:11:10.414 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z a975c54c-dd04-4d49-8237-965a1710f76a ']' 00:11:10.414 18:14:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:10.414 [2024-07-12 18:14:54.077218] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:10.414 
[2024-07-12 18:14:54.077235] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:10.414 [2024-07-12 18:14:54.077286] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:10.414 [2024-07-12 18:14:54.077328] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:10.414 [2024-07-12 18:14:54.077340] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1faf270 name raid_bdev1, state offline 00:11:10.414 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.414 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:10.672 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:10.672 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:10.673 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:10.673 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:10.931 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:10.931 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:11.190 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:11.190 18:14:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:11.449 18:14:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:11.449 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:11.708 [2024-07-12 18:14:55.296416] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:11.708 [2024-07-12 18:14:55.297808] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:11.708 [2024-07-12 18:14:55.297864] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:11.708 [2024-07-12 18:14:55.297907] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:11.708 [2024-07-12 18:14:55.297934] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:11.708 [2024-07-12 18:14:55.297944] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1faeff0 name raid_bdev1, state configuring 00:11:11.708 request: 00:11:11.708 { 00:11:11.708 "name": "raid_bdev1", 00:11:11.708 "raid_level": "raid0", 00:11:11.708 "base_bdevs": [ 00:11:11.708 "malloc1", 00:11:11.708 "malloc2" 00:11:11.708 ], 00:11:11.708 "strip_size_kb": 64, 00:11:11.708 "superblock": false, 00:11:11.708 "method": "bdev_raid_create", 00:11:11.708 "req_id": 1 00:11:11.708 } 00:11:11.708 Got JSON-RPC error response 00:11:11.708 response: 00:11:11.708 { 00:11:11.708 "code": -17, 00:11:11.708 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:11.708 } 00:11:11.708 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:11.708 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:11.708 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:11.708 18:14:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:11.708 18:14:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.708 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:11.967 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:11.967 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:11.967 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:12.225 [2024-07-12 18:14:55.789659] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:12.225 [2024-07-12 18:14:55.789709] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:12.225 [2024-07-12 18:14:55.789730] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0b7a0 00:11:12.225 [2024-07-12 18:14:55.789742] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:12.225 [2024-07-12 18:14:55.791437] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:12.225 [2024-07-12 18:14:55.791467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:12.225 [2024-07-12 18:14:55.791541] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:12.225 [2024-07-12 18:14:55.791569] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:12.225 pt1 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:12.225 18:14:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:12.225 18:14:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:12.484 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:12.484 "name": "raid_bdev1", 00:11:12.484 "uuid": "a975c54c-dd04-4d49-8237-965a1710f76a", 00:11:12.484 "strip_size_kb": 64, 00:11:12.484 "state": "configuring", 00:11:12.484 "raid_level": "raid0", 00:11:12.484 "superblock": true, 00:11:12.484 "num_base_bdevs": 2, 00:11:12.484 "num_base_bdevs_discovered": 1, 00:11:12.484 "num_base_bdevs_operational": 2, 00:11:12.484 "base_bdevs_list": [ 00:11:12.484 { 00:11:12.484 "name": "pt1", 00:11:12.484 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:12.484 "is_configured": true, 00:11:12.484 "data_offset": 2048, 00:11:12.484 "data_size": 63488 00:11:12.484 }, 00:11:12.484 { 00:11:12.484 "name": null, 00:11:12.484 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:12.484 
"is_configured": false, 00:11:12.484 "data_offset": 2048, 00:11:12.484 "data_size": 63488 00:11:12.484 } 00:11:12.484 ] 00:11:12.484 }' 00:11:12.484 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:12.484 18:14:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:13.052 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:13.052 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:13.052 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:13.052 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:13.310 [2024-07-12 18:14:56.892633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:13.310 [2024-07-12 18:14:56.892681] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:13.310 [2024-07-12 18:14:56.892700] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa5820 00:11:13.310 [2024-07-12 18:14:56.892712] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:13.310 [2024-07-12 18:14:56.893070] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:13.310 [2024-07-12 18:14:56.893088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:13.310 [2024-07-12 18:14:56.893154] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:13.310 [2024-07-12 18:14:56.893172] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:13.310 [2024-07-12 18:14:56.893267] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e01ec0 00:11:13.310 [2024-07-12 
18:14:56.893277] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:13.310 [2024-07-12 18:14:56.893446] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e04530 00:11:13.310 [2024-07-12 18:14:56.893565] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e01ec0 00:11:13.310 [2024-07-12 18:14:56.893574] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e01ec0 00:11:13.310 [2024-07-12 18:14:56.893673] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:13.310 pt2 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.310 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.311 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.311 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.311 18:14:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.311 18:14:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:13.569 18:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.569 "name": "raid_bdev1", 00:11:13.569 "uuid": "a975c54c-dd04-4d49-8237-965a1710f76a", 00:11:13.569 "strip_size_kb": 64, 00:11:13.569 "state": "online", 00:11:13.569 "raid_level": "raid0", 00:11:13.569 "superblock": true, 00:11:13.569 "num_base_bdevs": 2, 00:11:13.569 "num_base_bdevs_discovered": 2, 00:11:13.569 "num_base_bdevs_operational": 2, 00:11:13.569 "base_bdevs_list": [ 00:11:13.569 { 00:11:13.569 "name": "pt1", 00:11:13.569 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:13.569 "is_configured": true, 00:11:13.569 "data_offset": 2048, 00:11:13.569 "data_size": 63488 00:11:13.569 }, 00:11:13.569 { 00:11:13.569 "name": "pt2", 00:11:13.569 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:13.569 "is_configured": true, 00:11:13.569 "data_offset": 2048, 00:11:13.569 "data_size": 63488 00:11:13.569 } 00:11:13.569 ] 00:11:13.569 }' 00:11:13.569 18:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.569 18:14:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.135 18:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:14.135 18:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:14.135 18:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:14.135 18:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:14.135 18:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:14.135 18:14:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:14.135 18:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:14.135 18:14:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:14.393 [2024-07-12 18:14:57.979766] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:14.393 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:14.393 "name": "raid_bdev1", 00:11:14.393 "aliases": [ 00:11:14.393 "a975c54c-dd04-4d49-8237-965a1710f76a" 00:11:14.393 ], 00:11:14.393 "product_name": "Raid Volume", 00:11:14.393 "block_size": 512, 00:11:14.393 "num_blocks": 126976, 00:11:14.393 "uuid": "a975c54c-dd04-4d49-8237-965a1710f76a", 00:11:14.393 "assigned_rate_limits": { 00:11:14.393 "rw_ios_per_sec": 0, 00:11:14.393 "rw_mbytes_per_sec": 0, 00:11:14.393 "r_mbytes_per_sec": 0, 00:11:14.393 "w_mbytes_per_sec": 0 00:11:14.393 }, 00:11:14.393 "claimed": false, 00:11:14.393 "zoned": false, 00:11:14.393 "supported_io_types": { 00:11:14.393 "read": true, 00:11:14.393 "write": true, 00:11:14.393 "unmap": true, 00:11:14.393 "flush": true, 00:11:14.393 "reset": true, 00:11:14.393 "nvme_admin": false, 00:11:14.393 "nvme_io": false, 00:11:14.393 "nvme_io_md": false, 00:11:14.393 "write_zeroes": true, 00:11:14.393 "zcopy": false, 00:11:14.393 "get_zone_info": false, 00:11:14.393 "zone_management": false, 00:11:14.393 "zone_append": false, 00:11:14.393 "compare": false, 00:11:14.393 "compare_and_write": false, 00:11:14.393 "abort": false, 00:11:14.393 "seek_hole": false, 00:11:14.393 "seek_data": false, 00:11:14.393 "copy": false, 00:11:14.393 "nvme_iov_md": false 00:11:14.393 }, 00:11:14.393 "memory_domains": [ 00:11:14.393 { 00:11:14.393 "dma_device_id": "system", 00:11:14.393 "dma_device_type": 1 00:11:14.393 }, 00:11:14.393 { 
00:11:14.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.393 "dma_device_type": 2 00:11:14.393 }, 00:11:14.393 { 00:11:14.393 "dma_device_id": "system", 00:11:14.393 "dma_device_type": 1 00:11:14.393 }, 00:11:14.393 { 00:11:14.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.393 "dma_device_type": 2 00:11:14.393 } 00:11:14.393 ], 00:11:14.393 "driver_specific": { 00:11:14.393 "raid": { 00:11:14.393 "uuid": "a975c54c-dd04-4d49-8237-965a1710f76a", 00:11:14.393 "strip_size_kb": 64, 00:11:14.393 "state": "online", 00:11:14.393 "raid_level": "raid0", 00:11:14.393 "superblock": true, 00:11:14.393 "num_base_bdevs": 2, 00:11:14.393 "num_base_bdevs_discovered": 2, 00:11:14.393 "num_base_bdevs_operational": 2, 00:11:14.393 "base_bdevs_list": [ 00:11:14.393 { 00:11:14.393 "name": "pt1", 00:11:14.393 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:14.393 "is_configured": true, 00:11:14.393 "data_offset": 2048, 00:11:14.393 "data_size": 63488 00:11:14.393 }, 00:11:14.393 { 00:11:14.393 "name": "pt2", 00:11:14.393 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:14.393 "is_configured": true, 00:11:14.393 "data_offset": 2048, 00:11:14.393 "data_size": 63488 00:11:14.393 } 00:11:14.393 ] 00:11:14.393 } 00:11:14.393 } 00:11:14.393 }' 00:11:14.393 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:14.393 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:14.393 pt2' 00:11:14.393 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:14.393 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:14.393 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:14.652 18:14:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:14.652 "name": "pt1", 00:11:14.652 "aliases": [ 00:11:14.652 "00000000-0000-0000-0000-000000000001" 00:11:14.652 ], 00:11:14.652 "product_name": "passthru", 00:11:14.652 "block_size": 512, 00:11:14.652 "num_blocks": 65536, 00:11:14.652 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:14.652 "assigned_rate_limits": { 00:11:14.652 "rw_ios_per_sec": 0, 00:11:14.652 "rw_mbytes_per_sec": 0, 00:11:14.652 "r_mbytes_per_sec": 0, 00:11:14.652 "w_mbytes_per_sec": 0 00:11:14.652 }, 00:11:14.652 "claimed": true, 00:11:14.652 "claim_type": "exclusive_write", 00:11:14.652 "zoned": false, 00:11:14.652 "supported_io_types": { 00:11:14.652 "read": true, 00:11:14.652 "write": true, 00:11:14.652 "unmap": true, 00:11:14.652 "flush": true, 00:11:14.652 "reset": true, 00:11:14.652 "nvme_admin": false, 00:11:14.652 "nvme_io": false, 00:11:14.652 "nvme_io_md": false, 00:11:14.652 "write_zeroes": true, 00:11:14.652 "zcopy": true, 00:11:14.652 "get_zone_info": false, 00:11:14.652 "zone_management": false, 00:11:14.652 "zone_append": false, 00:11:14.652 "compare": false, 00:11:14.652 "compare_and_write": false, 00:11:14.652 "abort": true, 00:11:14.652 "seek_hole": false, 00:11:14.652 "seek_data": false, 00:11:14.652 "copy": true, 00:11:14.652 "nvme_iov_md": false 00:11:14.652 }, 00:11:14.652 "memory_domains": [ 00:11:14.652 { 00:11:14.652 "dma_device_id": "system", 00:11:14.652 "dma_device_type": 1 00:11:14.652 }, 00:11:14.652 { 00:11:14.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.652 "dma_device_type": 2 00:11:14.652 } 00:11:14.652 ], 00:11:14.652 "driver_specific": { 00:11:14.652 "passthru": { 00:11:14.652 "name": "pt1", 00:11:14.652 "base_bdev_name": "malloc1" 00:11:14.652 } 00:11:14.652 } 00:11:14.652 }' 00:11:14.652 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:14.652 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:14.911 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:15.169 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:15.169 "name": "pt2", 00:11:15.169 "aliases": [ 00:11:15.169 "00000000-0000-0000-0000-000000000002" 00:11:15.169 ], 00:11:15.169 "product_name": "passthru", 00:11:15.169 "block_size": 512, 00:11:15.169 "num_blocks": 65536, 00:11:15.169 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:15.169 "assigned_rate_limits": { 00:11:15.170 "rw_ios_per_sec": 0, 00:11:15.170 "rw_mbytes_per_sec": 0, 00:11:15.170 "r_mbytes_per_sec": 0, 00:11:15.170 "w_mbytes_per_sec": 0 00:11:15.170 }, 
00:11:15.170 "claimed": true, 00:11:15.170 "claim_type": "exclusive_write", 00:11:15.170 "zoned": false, 00:11:15.170 "supported_io_types": { 00:11:15.170 "read": true, 00:11:15.170 "write": true, 00:11:15.170 "unmap": true, 00:11:15.170 "flush": true, 00:11:15.170 "reset": true, 00:11:15.170 "nvme_admin": false, 00:11:15.170 "nvme_io": false, 00:11:15.170 "nvme_io_md": false, 00:11:15.170 "write_zeroes": true, 00:11:15.170 "zcopy": true, 00:11:15.170 "get_zone_info": false, 00:11:15.170 "zone_management": false, 00:11:15.170 "zone_append": false, 00:11:15.170 "compare": false, 00:11:15.170 "compare_and_write": false, 00:11:15.170 "abort": true, 00:11:15.170 "seek_hole": false, 00:11:15.170 "seek_data": false, 00:11:15.170 "copy": true, 00:11:15.170 "nvme_iov_md": false 00:11:15.170 }, 00:11:15.170 "memory_domains": [ 00:11:15.170 { 00:11:15.170 "dma_device_id": "system", 00:11:15.170 "dma_device_type": 1 00:11:15.170 }, 00:11:15.170 { 00:11:15.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:15.170 "dma_device_type": 2 00:11:15.170 } 00:11:15.170 ], 00:11:15.170 "driver_specific": { 00:11:15.170 "passthru": { 00:11:15.170 "name": "pt2", 00:11:15.170 "base_bdev_name": "malloc2" 00:11:15.170 } 00:11:15.170 } 00:11:15.170 }' 00:11:15.170 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:15.428 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:15.428 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:15.428 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:15.428 18:14:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:15.428 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:15.428 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:15.428 18:14:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:15.428 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:15.428 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:15.686 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:15.686 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:15.687 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:15.687 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:15.945 [2024-07-12 18:14:59.431621] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:15.945 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' a975c54c-dd04-4d49-8237-965a1710f76a '!=' a975c54c-dd04-4d49-8237-965a1710f76a ']' 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2459278 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2459278 ']' 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2459278 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2459278 00:11:15.946 
18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2459278' 00:11:15.946 killing process with pid 2459278 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2459278 00:11:15.946 [2024-07-12 18:14:59.498192] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:15.946 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2459278 00:11:15.946 [2024-07-12 18:14:59.498244] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:15.946 [2024-07-12 18:14:59.498286] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:15.946 [2024-07-12 18:14:59.498297] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e01ec0 name raid_bdev1, state offline 00:11:15.946 [2024-07-12 18:14:59.514409] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:16.205 18:14:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:16.205 00:11:16.205 real 0m10.420s 00:11:16.205 user 0m18.602s 00:11:16.205 sys 0m1.898s 00:11:16.205 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:16.205 18:14:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:16.205 ************************************ 00:11:16.205 END TEST raid_superblock_test 00:11:16.205 ************************************ 00:11:16.205 18:14:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:16.205 18:14:59 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:11:16.205 18:14:59 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:16.205 18:14:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:16.205 18:14:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:16.205 ************************************ 00:11:16.205 START TEST raid_read_error_test 00:11:16.205 ************************************ 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.jYlObtEi6P 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2460911 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2460911 /var/tmp/spdk-raid.sock 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2460911 ']' 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:16.205 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:16.205 18:14:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:16.205 [2024-07-12 18:14:59.883365] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:11:16.205 [2024-07-12 18:14:59.883432] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2460911 ] 00:11:16.464 [2024-07-12 18:15:00.013775] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:16.464 [2024-07-12 18:15:00.126165] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:16.723 [2024-07-12 18:15:00.196791] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:16.723 [2024-07-12 18:15:00.196830] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:17.657 18:15:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:17.657 18:15:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:17.657 18:15:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:17.657 18:15:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:17.657 BaseBdev1_malloc 00:11:17.657 18:15:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:18.225 true 00:11:18.225 18:15:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:18.484 [2024-07-12 18:15:02.093553] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:18.484 [2024-07-12 18:15:02.093600] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:18.484 [2024-07-12 18:15:02.093624] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25bf0d0 00:11:18.484 [2024-07-12 18:15:02.093636] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:18.484 [2024-07-12 18:15:02.095554] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:18.484 [2024-07-12 18:15:02.095583] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:18.484 BaseBdev1 00:11:18.484 18:15:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:18.484 18:15:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:19.049 BaseBdev2_malloc 00:11:19.049 18:15:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:19.307 true 00:11:19.307 18:15:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:19.872 [2024-07-12 18:15:03.353645] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:19.872 [2024-07-12 18:15:03.353692] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:19.872 [2024-07-12 18:15:03.353715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c3910 00:11:19.872 [2024-07-12 18:15:03.353728] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:19.872 [2024-07-12 18:15:03.355356] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:19.872 [2024-07-12 18:15:03.355384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:19.872 BaseBdev2 00:11:19.872 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:20.438 [2024-07-12 18:15:03.863004] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:20.438 [2024-07-12 18:15:03.864313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:20.438 [2024-07-12 18:15:03.864499] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25c5320 00:11:20.438 [2024-07-12 18:15:03.864512] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:20.438 [2024-07-12 18:15:03.864704] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25c4270 00:11:20.438 [2024-07-12 18:15:03.864849] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25c5320 00:11:20.438 [2024-07-12 18:15:03.864859] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25c5320 00:11:20.438 [2024-07-12 18:15:03.864976] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:20.438 18:15:03 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.438 18:15:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:20.438 18:15:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:20.438 "name": "raid_bdev1", 00:11:20.438 "uuid": "ba2860f9-c02b-46d6-98e1-914e43111120", 00:11:20.438 "strip_size_kb": 64, 00:11:20.438 "state": "online", 00:11:20.438 "raid_level": "raid0", 00:11:20.438 "superblock": true, 00:11:20.438 "num_base_bdevs": 2, 00:11:20.438 "num_base_bdevs_discovered": 2, 00:11:20.438 "num_base_bdevs_operational": 2, 00:11:20.438 "base_bdevs_list": [ 00:11:20.438 { 00:11:20.438 "name": "BaseBdev1", 00:11:20.438 "uuid": 
"9afd4565-2d20-5fad-bcda-3a974f496eba", 00:11:20.438 "is_configured": true, 00:11:20.438 "data_offset": 2048, 00:11:20.438 "data_size": 63488 00:11:20.438 }, 00:11:20.438 { 00:11:20.438 "name": "BaseBdev2", 00:11:20.438 "uuid": "20f735f9-a9bc-59fc-9a0e-54a2af869b32", 00:11:20.438 "is_configured": true, 00:11:20.438 "data_offset": 2048, 00:11:20.438 "data_size": 63488 00:11:20.438 } 00:11:20.438 ] 00:11:20.438 }' 00:11:20.438 18:15:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:20.438 18:15:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:21.005 18:15:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:21.005 18:15:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:21.264 [2024-07-12 18:15:04.837861] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25c09b0 00:11:22.200 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:22.458 18:15:05 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.458 18:15:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:22.717 18:15:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.717 "name": "raid_bdev1", 00:11:22.717 "uuid": "ba2860f9-c02b-46d6-98e1-914e43111120", 00:11:22.717 "strip_size_kb": 64, 00:11:22.717 "state": "online", 00:11:22.717 "raid_level": "raid0", 00:11:22.717 "superblock": true, 00:11:22.717 "num_base_bdevs": 2, 00:11:22.717 "num_base_bdevs_discovered": 2, 00:11:22.717 "num_base_bdevs_operational": 2, 00:11:22.717 "base_bdevs_list": [ 00:11:22.717 { 00:11:22.717 "name": "BaseBdev1", 00:11:22.717 "uuid": "9afd4565-2d20-5fad-bcda-3a974f496eba", 00:11:22.717 "is_configured": true, 00:11:22.717 "data_offset": 2048, 00:11:22.717 "data_size": 63488 00:11:22.717 }, 00:11:22.717 { 00:11:22.717 "name": "BaseBdev2", 00:11:22.717 "uuid": "20f735f9-a9bc-59fc-9a0e-54a2af869b32", 00:11:22.717 "is_configured": true, 00:11:22.717 "data_offset": 2048, 00:11:22.717 "data_size": 63488 00:11:22.717 } 
00:11:22.717 ] 00:11:22.717 }' 00:11:22.717 18:15:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.717 18:15:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.284 18:15:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:23.544 [2024-07-12 18:15:07.042474] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:23.544 [2024-07-12 18:15:07.042506] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:23.544 [2024-07-12 18:15:07.045665] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:23.544 [2024-07-12 18:15:07.045695] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:23.544 [2024-07-12 18:15:07.045723] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:23.544 [2024-07-12 18:15:07.045734] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25c5320 name raid_bdev1, state offline 00:11:23.544 0 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2460911 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2460911 ']' 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2460911 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2460911 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2460911' 00:11:23.544 killing process with pid 2460911 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2460911 00:11:23.544 [2024-07-12 18:15:07.111307] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:23.544 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2460911 00:11:23.544 [2024-07-12 18:15:07.122194] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.jYlObtEi6P 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:11:23.804 00:11:23.804 real 0m7.546s 00:11:23.804 user 0m12.242s 00:11:23.804 sys 0m1.228s 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:23.804 18:15:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.804 ************************************ 00:11:23.804 END TEST raid_read_error_test 00:11:23.804 ************************************ 00:11:23.804 18:15:07 bdev_raid 
-- common/autotest_common.sh@1142 -- # return 0 00:11:23.804 18:15:07 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:11:23.804 18:15:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:23.804 18:15:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:23.804 18:15:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:23.804 ************************************ 00:11:23.804 START TEST raid_write_error_test 00:11:23.804 ************************************ 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:23.804 18:15:07 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.R5U0srRjbP 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2462557 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2462557 /var/tmp/spdk-raid.sock 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2462557 ']' 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:23.804 18:15:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:23.804 18:15:07 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:23.804 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:23.805 18:15:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:23.805 18:15:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.805 [2024-07-12 18:15:07.524086] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:11:23.805 [2024-07-12 18:15:07.524158] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2462557 ] 00:11:24.063 [2024-07-12 18:15:07.654996] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:24.063 [2024-07-12 18:15:07.756063] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:24.320 [2024-07-12 18:15:07.819263] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:24.320 [2024-07-12 18:15:07.819301] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:24.885 18:15:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:24.885 18:15:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:24.885 18:15:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:24.885 18:15:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:25.143 BaseBdev1_malloc 00:11:25.143 18:15:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:25.401 true 00:11:25.401 18:15:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:25.659 [2024-07-12 18:15:09.185747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:25.659 [2024-07-12 18:15:09.185791] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:25.659 [2024-07-12 18:15:09.185809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f9d0d0 00:11:25.659 [2024-07-12 18:15:09.185822] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:25.659 [2024-07-12 18:15:09.187526] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:25.659 [2024-07-12 18:15:09.187554] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:25.659 BaseBdev1 00:11:25.659 18:15:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:25.659 18:15:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:25.916 BaseBdev2_malloc 00:11:25.916 18:15:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:26.174 true 00:11:26.174 18:15:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:26.432 [2024-07-12 18:15:09.924229] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:26.432 [2024-07-12 18:15:09.924275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:26.432 [2024-07-12 18:15:09.924295] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa1910 00:11:26.432 [2024-07-12 18:15:09.924307] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:26.432 [2024-07-12 18:15:09.925812] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:26.432 [2024-07-12 18:15:09.925841] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:26.432 BaseBdev2 00:11:26.432 18:15:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:26.690 [2024-07-12 18:15:10.172935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:26.690 [2024-07-12 18:15:10.174225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:26.690 [2024-07-12 18:15:10.174405] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fa3320 00:11:26.690 [2024-07-12 18:15:10.174419] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:26.690 [2024-07-12 18:15:10.174617] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa2270 00:11:26.690 [2024-07-12 18:15:10.174763] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fa3320 00:11:26.690 [2024-07-12 18:15:10.174773] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fa3320 00:11:26.690 [2024-07-12 18:15:10.174876] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:26.690 18:15:10 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.690 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:26.949 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:26.949 "name": "raid_bdev1", 00:11:26.949 "uuid": "382c4f56-4822-4571-bba6-9a711fe0169d", 00:11:26.949 "strip_size_kb": 64, 00:11:26.949 "state": "online", 00:11:26.949 "raid_level": "raid0", 00:11:26.949 "superblock": true, 00:11:26.949 "num_base_bdevs": 2, 00:11:26.949 "num_base_bdevs_discovered": 2, 00:11:26.949 "num_base_bdevs_operational": 2, 00:11:26.949 "base_bdevs_list": [ 00:11:26.949 { 00:11:26.949 "name": "BaseBdev1", 00:11:26.949 "uuid": 
"02252132-a531-5a10-93f6-b6ca22df1f3f", 00:11:26.949 "is_configured": true, 00:11:26.949 "data_offset": 2048, 00:11:26.949 "data_size": 63488 00:11:26.949 }, 00:11:26.949 { 00:11:26.949 "name": "BaseBdev2", 00:11:26.949 "uuid": "c7a30682-dc61-5abb-95e2-ecf989598e4c", 00:11:26.949 "is_configured": true, 00:11:26.949 "data_offset": 2048, 00:11:26.949 "data_size": 63488 00:11:26.949 } 00:11:26.949 ] 00:11:26.949 }' 00:11:26.949 18:15:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:26.949 18:15:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.516 18:15:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:27.516 18:15:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:27.516 [2024-07-12 18:15:11.123723] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f9e9b0 00:11:28.452 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:28.710 18:15:12 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.710 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:28.969 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:28.969 "name": "raid_bdev1", 00:11:28.969 "uuid": "382c4f56-4822-4571-bba6-9a711fe0169d", 00:11:28.969 "strip_size_kb": 64, 00:11:28.969 "state": "online", 00:11:28.969 "raid_level": "raid0", 00:11:28.969 "superblock": true, 00:11:28.969 "num_base_bdevs": 2, 00:11:28.969 "num_base_bdevs_discovered": 2, 00:11:28.969 "num_base_bdevs_operational": 2, 00:11:28.969 "base_bdevs_list": [ 00:11:28.969 { 00:11:28.969 "name": "BaseBdev1", 00:11:28.969 "uuid": "02252132-a531-5a10-93f6-b6ca22df1f3f", 00:11:28.969 "is_configured": true, 00:11:28.969 "data_offset": 2048, 00:11:28.969 "data_size": 63488 00:11:28.969 }, 00:11:28.969 { 00:11:28.969 "name": "BaseBdev2", 00:11:28.969 "uuid": "c7a30682-dc61-5abb-95e2-ecf989598e4c", 00:11:28.969 "is_configured": true, 00:11:28.969 "data_offset": 2048, 00:11:28.969 "data_size": 63488 00:11:28.969 
} 00:11:28.969 ] 00:11:28.969 }' 00:11:28.969 18:15:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:28.969 18:15:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:29.537 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:29.794 [2024-07-12 18:15:13.304009] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:29.794 [2024-07-12 18:15:13.304051] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:29.794 [2024-07-12 18:15:13.307217] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:29.794 [2024-07-12 18:15:13.307248] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:29.794 [2024-07-12 18:15:13.307274] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:29.794 [2024-07-12 18:15:13.307285] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa3320 name raid_bdev1, state offline 00:11:29.794 0 00:11:29.794 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2462557 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2462557 ']' 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2462557 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2462557 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2462557' 00:11:29.795 killing process with pid 2462557 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2462557 00:11:29.795 [2024-07-12 18:15:13.371535] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:29.795 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2462557 00:11:29.795 [2024-07-12 18:15:13.382149] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.R5U0srRjbP 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:11:30.053 00:11:30.053 real 0m6.179s 00:11:30.053 user 0m9.600s 00:11:30.053 sys 0m1.095s 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:30.053 18:15:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.053 ************************************ 00:11:30.053 END TEST raid_write_error_test 00:11:30.053 
************************************ 00:11:30.053 18:15:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:30.053 18:15:13 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:30.053 18:15:13 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:30.053 18:15:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:30.053 18:15:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:30.053 18:15:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:30.053 ************************************ 00:11:30.053 START TEST raid_state_function_test 00:11:30.053 ************************************ 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2463454 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2463454' 00:11:30.053 Process raid pid: 2463454 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2463454 /var/tmp/spdk-raid.sock 
00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2463454 ']' 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:30.053 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:30.053 18:15:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.311 [2024-07-12 18:15:13.781662] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:11:30.311 [2024-07-12 18:15:13.781733] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:30.311 [2024-07-12 18:15:13.910698] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:30.311 [2024-07-12 18:15:14.013793] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:30.569 [2024-07-12 18:15:14.075128] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:30.569 [2024-07-12 18:15:14.075162] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:31.156 18:15:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:31.156 18:15:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:31.156 18:15:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:31.454 [2024-07-12 18:15:14.941403] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:31.454 [2024-07-12 18:15:14.941447] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:31.454 [2024-07-12 18:15:14.941458] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:31.454 [2024-07-12 18:15:14.941470] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.454 18:15:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:31.712 18:15:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:31.712 "name": "Existed_Raid", 00:11:31.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:31.712 "strip_size_kb": 64, 00:11:31.712 "state": "configuring", 00:11:31.712 "raid_level": "concat", 00:11:31.712 "superblock": false, 00:11:31.712 "num_base_bdevs": 2, 00:11:31.712 "num_base_bdevs_discovered": 0, 00:11:31.712 "num_base_bdevs_operational": 2, 00:11:31.712 "base_bdevs_list": [ 00:11:31.712 { 00:11:31.712 "name": "BaseBdev1", 00:11:31.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:31.712 "is_configured": false, 00:11:31.712 "data_offset": 0, 00:11:31.712 "data_size": 0 00:11:31.712 }, 00:11:31.712 { 00:11:31.712 "name": "BaseBdev2", 00:11:31.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:31.712 "is_configured": false, 00:11:31.712 "data_offset": 0, 00:11:31.712 "data_size": 0 00:11:31.712 } 00:11:31.712 ] 00:11:31.712 }' 00:11:31.712 18:15:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:31.712 18:15:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.295 18:15:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:32.295 [2024-07-12 18:15:16.016121] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:32.295 [2024-07-12 18:15:16.016151] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12cea80 name Existed_Raid, state configuring 00:11:32.551 18:15:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:32.551 [2024-07-12 18:15:16.264784] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:32.551 [2024-07-12 18:15:16.264811] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:32.551 [2024-07-12 18:15:16.264825] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:32.551 [2024-07-12 18:15:16.264837] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:32.807 18:15:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:32.807 [2024-07-12 18:15:16.519396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:32.807 BaseBdev1 00:11:33.064 18:15:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:33.064 18:15:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:33.064 18:15:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:33.064 18:15:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:33.064 18:15:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:33.064 18:15:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:33.064 18:15:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:33.064 18:15:16 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:33.322 [ 00:11:33.322 { 00:11:33.322 "name": "BaseBdev1", 00:11:33.322 "aliases": [ 00:11:33.322 "f8638bf2-4379-4d45-bada-387f180d3e72" 00:11:33.322 ], 00:11:33.322 "product_name": "Malloc disk", 00:11:33.322 "block_size": 512, 00:11:33.322 "num_blocks": 65536, 00:11:33.322 "uuid": "f8638bf2-4379-4d45-bada-387f180d3e72", 00:11:33.322 "assigned_rate_limits": { 00:11:33.322 "rw_ios_per_sec": 0, 00:11:33.322 "rw_mbytes_per_sec": 0, 00:11:33.322 "r_mbytes_per_sec": 0, 00:11:33.322 "w_mbytes_per_sec": 0 00:11:33.322 }, 00:11:33.322 "claimed": true, 00:11:33.322 "claim_type": "exclusive_write", 00:11:33.322 "zoned": false, 00:11:33.322 "supported_io_types": { 00:11:33.322 "read": true, 00:11:33.322 "write": true, 00:11:33.322 "unmap": true, 00:11:33.322 "flush": true, 00:11:33.322 "reset": true, 00:11:33.322 "nvme_admin": false, 00:11:33.322 "nvme_io": false, 00:11:33.322 "nvme_io_md": false, 00:11:33.322 "write_zeroes": true, 00:11:33.322 "zcopy": true, 00:11:33.322 "get_zone_info": false, 00:11:33.322 "zone_management": false, 00:11:33.322 "zone_append": false, 00:11:33.322 "compare": false, 00:11:33.322 "compare_and_write": false, 00:11:33.322 "abort": true, 00:11:33.322 "seek_hole": false, 00:11:33.322 "seek_data": false, 00:11:33.322 "copy": true, 00:11:33.322 "nvme_iov_md": false 00:11:33.322 }, 00:11:33.322 "memory_domains": [ 00:11:33.322 { 00:11:33.322 "dma_device_id": "system", 00:11:33.322 "dma_device_type": 1 00:11:33.322 }, 00:11:33.322 { 00:11:33.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.322 "dma_device_type": 2 00:11:33.322 } 00:11:33.322 ], 00:11:33.322 "driver_specific": {} 00:11:33.322 } 00:11:33.322 ] 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.322 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:33.581 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:33.581 "name": "Existed_Raid", 00:11:33.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.581 "strip_size_kb": 64, 00:11:33.581 "state": "configuring", 00:11:33.581 "raid_level": "concat", 00:11:33.581 "superblock": false, 00:11:33.581 "num_base_bdevs": 2, 00:11:33.581 "num_base_bdevs_discovered": 1, 00:11:33.581 "num_base_bdevs_operational": 2, 00:11:33.581 "base_bdevs_list": [ 00:11:33.581 { 00:11:33.581 "name": "BaseBdev1", 00:11:33.581 
"uuid": "f8638bf2-4379-4d45-bada-387f180d3e72", 00:11:33.581 "is_configured": true, 00:11:33.581 "data_offset": 0, 00:11:33.581 "data_size": 65536 00:11:33.581 }, 00:11:33.581 { 00:11:33.581 "name": "BaseBdev2", 00:11:33.581 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.581 "is_configured": false, 00:11:33.581 "data_offset": 0, 00:11:33.581 "data_size": 0 00:11:33.581 } 00:11:33.581 ] 00:11:33.581 }' 00:11:33.581 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:33.581 18:15:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:34.145 18:15:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:34.403 [2024-07-12 18:15:18.087560] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:34.403 [2024-07-12 18:15:18.087600] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ce350 name Existed_Raid, state configuring 00:11:34.403 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:34.660 [2024-07-12 18:15:18.332238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:34.660 [2024-07-12 18:15:18.333725] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:34.660 [2024-07-12 18:15:18.333759] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:34.660 18:15:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.660 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:34.918 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:34.918 "name": "Existed_Raid", 00:11:34.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.918 "strip_size_kb": 64, 00:11:34.918 "state": "configuring", 00:11:34.918 "raid_level": "concat", 00:11:34.918 "superblock": false, 00:11:34.918 "num_base_bdevs": 2, 00:11:34.918 "num_base_bdevs_discovered": 1, 00:11:34.918 "num_base_bdevs_operational": 2, 00:11:34.918 "base_bdevs_list": [ 00:11:34.918 { 
00:11:34.918 "name": "BaseBdev1", 00:11:34.918 "uuid": "f8638bf2-4379-4d45-bada-387f180d3e72", 00:11:34.918 "is_configured": true, 00:11:34.918 "data_offset": 0, 00:11:34.918 "data_size": 65536 00:11:34.918 }, 00:11:34.918 { 00:11:34.918 "name": "BaseBdev2", 00:11:34.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.918 "is_configured": false, 00:11:34.918 "data_offset": 0, 00:11:34.918 "data_size": 0 00:11:34.918 } 00:11:34.918 ] 00:11:34.918 }' 00:11:34.918 18:15:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:34.918 18:15:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.483 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:35.740 [2024-07-12 18:15:19.438500] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:35.740 [2024-07-12 18:15:19.438539] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12cf000 00:11:35.740 [2024-07-12 18:15:19.438547] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:35.740 [2024-07-12 18:15:19.438736] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e90c0 00:11:35.740 [2024-07-12 18:15:19.438855] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12cf000 00:11:35.740 [2024-07-12 18:15:19.438865] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12cf000 00:11:35.740 [2024-07-12 18:15:19.439035] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:35.740 BaseBdev2 00:11:35.740 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:35.740 18:15:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev2 00:11:35.740 18:15:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:35.740 18:15:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:35.740 18:15:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:35.740 18:15:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:35.740 18:15:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:35.997 18:15:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:36.255 [ 00:11:36.255 { 00:11:36.255 "name": "BaseBdev2", 00:11:36.255 "aliases": [ 00:11:36.255 "58f19631-ce6b-45cb-9bce-251440a61233" 00:11:36.255 ], 00:11:36.255 "product_name": "Malloc disk", 00:11:36.255 "block_size": 512, 00:11:36.255 "num_blocks": 65536, 00:11:36.255 "uuid": "58f19631-ce6b-45cb-9bce-251440a61233", 00:11:36.255 "assigned_rate_limits": { 00:11:36.255 "rw_ios_per_sec": 0, 00:11:36.255 "rw_mbytes_per_sec": 0, 00:11:36.255 "r_mbytes_per_sec": 0, 00:11:36.255 "w_mbytes_per_sec": 0 00:11:36.255 }, 00:11:36.255 "claimed": true, 00:11:36.255 "claim_type": "exclusive_write", 00:11:36.255 "zoned": false, 00:11:36.255 "supported_io_types": { 00:11:36.255 "read": true, 00:11:36.255 "write": true, 00:11:36.255 "unmap": true, 00:11:36.255 "flush": true, 00:11:36.255 "reset": true, 00:11:36.255 "nvme_admin": false, 00:11:36.255 "nvme_io": false, 00:11:36.255 "nvme_io_md": false, 00:11:36.255 "write_zeroes": true, 00:11:36.255 "zcopy": true, 00:11:36.255 "get_zone_info": false, 00:11:36.255 "zone_management": false, 00:11:36.255 "zone_append": false, 00:11:36.255 "compare": false, 
00:11:36.255 "compare_and_write": false, 00:11:36.255 "abort": true, 00:11:36.255 "seek_hole": false, 00:11:36.255 "seek_data": false, 00:11:36.255 "copy": true, 00:11:36.255 "nvme_iov_md": false 00:11:36.255 }, 00:11:36.255 "memory_domains": [ 00:11:36.255 { 00:11:36.255 "dma_device_id": "system", 00:11:36.255 "dma_device_type": 1 00:11:36.255 }, 00:11:36.255 { 00:11:36.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.255 "dma_device_type": 2 00:11:36.255 } 00:11:36.255 ], 00:11:36.255 "driver_specific": {} 00:11:36.255 } 00:11:36.255 ] 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 
-- # local tmp 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.255 18:15:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:36.513 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:36.513 "name": "Existed_Raid", 00:11:36.513 "uuid": "5dcc79c4-32fa-47ab-a134-64d1ba7e7488", 00:11:36.513 "strip_size_kb": 64, 00:11:36.513 "state": "online", 00:11:36.513 "raid_level": "concat", 00:11:36.513 "superblock": false, 00:11:36.513 "num_base_bdevs": 2, 00:11:36.513 "num_base_bdevs_discovered": 2, 00:11:36.513 "num_base_bdevs_operational": 2, 00:11:36.513 "base_bdevs_list": [ 00:11:36.513 { 00:11:36.513 "name": "BaseBdev1", 00:11:36.513 "uuid": "f8638bf2-4379-4d45-bada-387f180d3e72", 00:11:36.513 "is_configured": true, 00:11:36.513 "data_offset": 0, 00:11:36.513 "data_size": 65536 00:11:36.513 }, 00:11:36.513 { 00:11:36.513 "name": "BaseBdev2", 00:11:36.513 "uuid": "58f19631-ce6b-45cb-9bce-251440a61233", 00:11:36.513 "is_configured": true, 00:11:36.513 "data_offset": 0, 00:11:36.513 "data_size": 65536 00:11:36.513 } 00:11:36.513 ] 00:11:36.513 }' 00:11:36.513 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.513 18:15:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.099 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:37.099 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:37.099 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:37.099 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:11:37.099 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:37.099 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:37.099 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:37.099 18:15:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:37.356 [2024-07-12 18:15:21.010968] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:37.356 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:37.356 "name": "Existed_Raid", 00:11:37.356 "aliases": [ 00:11:37.356 "5dcc79c4-32fa-47ab-a134-64d1ba7e7488" 00:11:37.356 ], 00:11:37.356 "product_name": "Raid Volume", 00:11:37.356 "block_size": 512, 00:11:37.356 "num_blocks": 131072, 00:11:37.356 "uuid": "5dcc79c4-32fa-47ab-a134-64d1ba7e7488", 00:11:37.356 "assigned_rate_limits": { 00:11:37.356 "rw_ios_per_sec": 0, 00:11:37.356 "rw_mbytes_per_sec": 0, 00:11:37.356 "r_mbytes_per_sec": 0, 00:11:37.356 "w_mbytes_per_sec": 0 00:11:37.356 }, 00:11:37.356 "claimed": false, 00:11:37.356 "zoned": false, 00:11:37.356 "supported_io_types": { 00:11:37.356 "read": true, 00:11:37.356 "write": true, 00:11:37.356 "unmap": true, 00:11:37.356 "flush": true, 00:11:37.356 "reset": true, 00:11:37.356 "nvme_admin": false, 00:11:37.356 "nvme_io": false, 00:11:37.356 "nvme_io_md": false, 00:11:37.356 "write_zeroes": true, 00:11:37.356 "zcopy": false, 00:11:37.356 "get_zone_info": false, 00:11:37.356 "zone_management": false, 00:11:37.356 "zone_append": false, 00:11:37.356 "compare": false, 00:11:37.356 "compare_and_write": false, 00:11:37.356 "abort": false, 00:11:37.356 "seek_hole": false, 00:11:37.356 "seek_data": false, 00:11:37.356 "copy": false, 00:11:37.356 "nvme_iov_md": false 00:11:37.356 }, 
00:11:37.356 "memory_domains": [ 00:11:37.356 { 00:11:37.356 "dma_device_id": "system", 00:11:37.356 "dma_device_type": 1 00:11:37.356 }, 00:11:37.356 { 00:11:37.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.356 "dma_device_type": 2 00:11:37.357 }, 00:11:37.357 { 00:11:37.357 "dma_device_id": "system", 00:11:37.357 "dma_device_type": 1 00:11:37.357 }, 00:11:37.357 { 00:11:37.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.357 "dma_device_type": 2 00:11:37.357 } 00:11:37.357 ], 00:11:37.357 "driver_specific": { 00:11:37.357 "raid": { 00:11:37.357 "uuid": "5dcc79c4-32fa-47ab-a134-64d1ba7e7488", 00:11:37.357 "strip_size_kb": 64, 00:11:37.357 "state": "online", 00:11:37.357 "raid_level": "concat", 00:11:37.357 "superblock": false, 00:11:37.357 "num_base_bdevs": 2, 00:11:37.357 "num_base_bdevs_discovered": 2, 00:11:37.357 "num_base_bdevs_operational": 2, 00:11:37.357 "base_bdevs_list": [ 00:11:37.357 { 00:11:37.357 "name": "BaseBdev1", 00:11:37.357 "uuid": "f8638bf2-4379-4d45-bada-387f180d3e72", 00:11:37.357 "is_configured": true, 00:11:37.357 "data_offset": 0, 00:11:37.357 "data_size": 65536 00:11:37.357 }, 00:11:37.357 { 00:11:37.357 "name": "BaseBdev2", 00:11:37.357 "uuid": "58f19631-ce6b-45cb-9bce-251440a61233", 00:11:37.357 "is_configured": true, 00:11:37.357 "data_offset": 0, 00:11:37.357 "data_size": 65536 00:11:37.357 } 00:11:37.357 ] 00:11:37.357 } 00:11:37.357 } 00:11:37.357 }' 00:11:37.357 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:37.357 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:37.357 BaseBdev2' 00:11:37.357 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:37.357 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:37.357 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:37.614 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:37.614 "name": "BaseBdev1", 00:11:37.614 "aliases": [ 00:11:37.614 "f8638bf2-4379-4d45-bada-387f180d3e72" 00:11:37.614 ], 00:11:37.614 "product_name": "Malloc disk", 00:11:37.614 "block_size": 512, 00:11:37.614 "num_blocks": 65536, 00:11:37.614 "uuid": "f8638bf2-4379-4d45-bada-387f180d3e72", 00:11:37.614 "assigned_rate_limits": { 00:11:37.614 "rw_ios_per_sec": 0, 00:11:37.614 "rw_mbytes_per_sec": 0, 00:11:37.614 "r_mbytes_per_sec": 0, 00:11:37.614 "w_mbytes_per_sec": 0 00:11:37.614 }, 00:11:37.614 "claimed": true, 00:11:37.614 "claim_type": "exclusive_write", 00:11:37.614 "zoned": false, 00:11:37.614 "supported_io_types": { 00:11:37.614 "read": true, 00:11:37.614 "write": true, 00:11:37.614 "unmap": true, 00:11:37.614 "flush": true, 00:11:37.614 "reset": true, 00:11:37.614 "nvme_admin": false, 00:11:37.614 "nvme_io": false, 00:11:37.614 "nvme_io_md": false, 00:11:37.614 "write_zeroes": true, 00:11:37.614 "zcopy": true, 00:11:37.614 "get_zone_info": false, 00:11:37.614 "zone_management": false, 00:11:37.614 "zone_append": false, 00:11:37.614 "compare": false, 00:11:37.614 "compare_and_write": false, 00:11:37.614 "abort": true, 00:11:37.614 "seek_hole": false, 00:11:37.614 "seek_data": false, 00:11:37.614 "copy": true, 00:11:37.614 "nvme_iov_md": false 00:11:37.614 }, 00:11:37.614 "memory_domains": [ 00:11:37.614 { 00:11:37.614 "dma_device_id": "system", 00:11:37.614 "dma_device_type": 1 00:11:37.614 }, 00:11:37.614 { 00:11:37.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.614 "dma_device_type": 2 00:11:37.614 } 00:11:37.614 ], 00:11:37.614 "driver_specific": {} 00:11:37.614 }' 00:11:37.614 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:37.872 18:15:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:37.872 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:37.872 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:37.872 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:37.872 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:37.872 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:37.872 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:37.872 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:37.872 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.129 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.129 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:38.129 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.129 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:38.129 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:38.385 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:38.385 "name": "BaseBdev2", 00:11:38.385 "aliases": [ 00:11:38.385 "58f19631-ce6b-45cb-9bce-251440a61233" 00:11:38.385 ], 00:11:38.385 "product_name": "Malloc disk", 00:11:38.385 "block_size": 512, 00:11:38.385 "num_blocks": 65536, 00:11:38.385 "uuid": "58f19631-ce6b-45cb-9bce-251440a61233", 00:11:38.385 "assigned_rate_limits": { 00:11:38.385 
"rw_ios_per_sec": 0, 00:11:38.385 "rw_mbytes_per_sec": 0, 00:11:38.385 "r_mbytes_per_sec": 0, 00:11:38.385 "w_mbytes_per_sec": 0 00:11:38.385 }, 00:11:38.385 "claimed": true, 00:11:38.385 "claim_type": "exclusive_write", 00:11:38.385 "zoned": false, 00:11:38.385 "supported_io_types": { 00:11:38.385 "read": true, 00:11:38.385 "write": true, 00:11:38.385 "unmap": true, 00:11:38.385 "flush": true, 00:11:38.385 "reset": true, 00:11:38.385 "nvme_admin": false, 00:11:38.385 "nvme_io": false, 00:11:38.385 "nvme_io_md": false, 00:11:38.385 "write_zeroes": true, 00:11:38.385 "zcopy": true, 00:11:38.385 "get_zone_info": false, 00:11:38.385 "zone_management": false, 00:11:38.385 "zone_append": false, 00:11:38.385 "compare": false, 00:11:38.385 "compare_and_write": false, 00:11:38.385 "abort": true, 00:11:38.385 "seek_hole": false, 00:11:38.385 "seek_data": false, 00:11:38.385 "copy": true, 00:11:38.385 "nvme_iov_md": false 00:11:38.385 }, 00:11:38.385 "memory_domains": [ 00:11:38.385 { 00:11:38.385 "dma_device_id": "system", 00:11:38.385 "dma_device_type": 1 00:11:38.385 }, 00:11:38.385 { 00:11:38.385 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.385 "dma_device_type": 2 00:11:38.385 } 00:11:38.385 ], 00:11:38.385 "driver_specific": {} 00:11:38.385 }' 00:11:38.385 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.385 18:15:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:38.385 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:38.385 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.385 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:38.385 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:38.385 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.643 
18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:38.643 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:38.643 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.643 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:38.643 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:38.643 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:38.900 [2024-07-12 18:15:22.478619] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:38.900 [2024-07-12 18:15:22.478647] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:38.900 [2024-07-12 18:15:22.478687] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:38.900 18:15:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.900 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:39.158 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.158 "name": "Existed_Raid", 00:11:39.158 "uuid": "5dcc79c4-32fa-47ab-a134-64d1ba7e7488", 00:11:39.158 "strip_size_kb": 64, 00:11:39.158 "state": "offline", 00:11:39.158 "raid_level": "concat", 00:11:39.158 "superblock": false, 00:11:39.158 "num_base_bdevs": 2, 00:11:39.158 "num_base_bdevs_discovered": 1, 00:11:39.158 "num_base_bdevs_operational": 1, 00:11:39.158 "base_bdevs_list": [ 00:11:39.158 { 00:11:39.158 "name": null, 00:11:39.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.158 "is_configured": false, 00:11:39.158 "data_offset": 0, 00:11:39.158 "data_size": 65536 00:11:39.158 }, 00:11:39.158 { 00:11:39.158 "name": "BaseBdev2", 00:11:39.158 "uuid": "58f19631-ce6b-45cb-9bce-251440a61233", 00:11:39.158 "is_configured": true, 00:11:39.158 "data_offset": 0, 00:11:39.158 
"data_size": 65536 00:11:39.158 } 00:11:39.158 ] 00:11:39.158 }' 00:11:39.158 18:15:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.158 18:15:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:39.721 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:39.721 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:39.721 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:39.721 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.978 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:39.978 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:39.978 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:40.236 [2024-07-12 18:15:23.828284] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:40.236 [2024-07-12 18:15:23.828335] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12cf000 name Existed_Raid, state offline 00:11:40.236 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:40.236 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:40.236 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:40.236 18:15:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2463454 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2463454 ']' 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2463454 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2463454 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2463454' 00:11:40.494 killing process with pid 2463454 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2463454 00:11:40.494 [2024-07-12 18:15:24.158444] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:40.494 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2463454 00:11:40.494 [2024-07-12 18:15:24.159422] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:40.753 00:11:40.753 real 0m10.673s 00:11:40.753 
user 0m18.994s 00:11:40.753 sys 0m1.969s 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.753 ************************************ 00:11:40.753 END TEST raid_state_function_test 00:11:40.753 ************************************ 00:11:40.753 18:15:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:40.753 18:15:24 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:11:40.753 18:15:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:40.753 18:15:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:40.753 18:15:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:40.753 ************************************ 00:11:40.753 START TEST raid_state_function_test_sb 00:11:40.753 ************************************ 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:40.753 18:15:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2465080 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2465080' 00:11:40.753 Process raid pid: 2465080 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2465080 /var/tmp/spdk-raid.sock 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2465080 ']' 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:40.753 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:40.753 18:15:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:41.011 [2024-07-12 18:15:24.522800] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:11:41.011 [2024-07-12 18:15:24.522864] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:41.011 [2024-07-12 18:15:24.652681] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:41.270 [2024-07-12 18:15:24.759840] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:41.270 [2024-07-12 18:15:24.826527] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.270 [2024-07-12 18:15:24.826563] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.838 18:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:41.838 18:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:41.838 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:42.096 [2024-07-12 18:15:25.679909] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:42.096 [2024-07-12 18:15:25.679961] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:42.097 [2024-07-12 18:15:25.679972] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:42.097 [2024-07-12 18:15:25.679984] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.097 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:42.355 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:42.355 "name": "Existed_Raid", 00:11:42.355 "uuid": "b92fd4a0-aea7-4a04-a35f-f298e8d560fc", 00:11:42.355 "strip_size_kb": 64, 00:11:42.355 "state": "configuring", 00:11:42.355 "raid_level": "concat", 00:11:42.355 "superblock": true, 00:11:42.355 "num_base_bdevs": 2, 00:11:42.355 "num_base_bdevs_discovered": 0, 00:11:42.355 "num_base_bdevs_operational": 2, 00:11:42.355 "base_bdevs_list": [ 00:11:42.355 { 00:11:42.355 "name": "BaseBdev1", 00:11:42.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:42.355 "is_configured": false, 00:11:42.355 "data_offset": 0, 00:11:42.355 "data_size": 0 00:11:42.355 }, 00:11:42.355 { 
00:11:42.355 "name": "BaseBdev2", 00:11:42.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:42.355 "is_configured": false, 00:11:42.355 "data_offset": 0, 00:11:42.355 "data_size": 0 00:11:42.355 } 00:11:42.355 ] 00:11:42.355 }' 00:11:42.355 18:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:42.355 18:15:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:42.921 18:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:43.180 [2024-07-12 18:15:26.758635] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:43.180 [2024-07-12 18:15:26.758665] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfca80 name Existed_Raid, state configuring 00:11:43.180 18:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:43.439 [2024-07-12 18:15:26.931119] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:43.439 [2024-07-12 18:15:26.931152] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:43.439 [2024-07-12 18:15:26.931161] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:43.439 [2024-07-12 18:15:26.931173] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:43.439 18:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:43.439 [2024-07-12 18:15:27.113516] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:43.439 BaseBdev1 00:11:43.439 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:43.439 18:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:43.439 18:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:43.439 18:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:43.439 18:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:43.439 18:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:43.439 18:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:43.698 18:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:43.957 [ 00:11:43.957 { 00:11:43.957 "name": "BaseBdev1", 00:11:43.957 "aliases": [ 00:11:43.957 "65694e66-febd-4097-afbd-ddd867796d6d" 00:11:43.957 ], 00:11:43.957 "product_name": "Malloc disk", 00:11:43.957 "block_size": 512, 00:11:43.957 "num_blocks": 65536, 00:11:43.957 "uuid": "65694e66-febd-4097-afbd-ddd867796d6d", 00:11:43.957 "assigned_rate_limits": { 00:11:43.957 "rw_ios_per_sec": 0, 00:11:43.957 "rw_mbytes_per_sec": 0, 00:11:43.957 "r_mbytes_per_sec": 0, 00:11:43.957 "w_mbytes_per_sec": 0 00:11:43.957 }, 00:11:43.957 "claimed": true, 00:11:43.957 "claim_type": "exclusive_write", 00:11:43.957 "zoned": false, 00:11:43.957 "supported_io_types": { 00:11:43.957 "read": true, 00:11:43.957 "write": true, 00:11:43.957 "unmap": true, 00:11:43.957 "flush": 
true, 00:11:43.957 "reset": true, 00:11:43.957 "nvme_admin": false, 00:11:43.957 "nvme_io": false, 00:11:43.957 "nvme_io_md": false, 00:11:43.957 "write_zeroes": true, 00:11:43.957 "zcopy": true, 00:11:43.957 "get_zone_info": false, 00:11:43.957 "zone_management": false, 00:11:43.957 "zone_append": false, 00:11:43.957 "compare": false, 00:11:43.957 "compare_and_write": false, 00:11:43.957 "abort": true, 00:11:43.957 "seek_hole": false, 00:11:43.957 "seek_data": false, 00:11:43.957 "copy": true, 00:11:43.957 "nvme_iov_md": false 00:11:43.957 }, 00:11:43.957 "memory_domains": [ 00:11:43.957 { 00:11:43.957 "dma_device_id": "system", 00:11:43.957 "dma_device_type": 1 00:11:43.957 }, 00:11:43.957 { 00:11:43.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.957 "dma_device_type": 2 00:11:43.957 } 00:11:43.957 ], 00:11:43.957 "driver_specific": {} 00:11:43.957 } 00:11:43.957 ] 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:43.957 18:15:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.957 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.216 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.216 "name": "Existed_Raid", 00:11:44.216 "uuid": "3ea45bae-232d-433f-b64f-1e612247e3e7", 00:11:44.216 "strip_size_kb": 64, 00:11:44.216 "state": "configuring", 00:11:44.216 "raid_level": "concat", 00:11:44.216 "superblock": true, 00:11:44.216 "num_base_bdevs": 2, 00:11:44.216 "num_base_bdevs_discovered": 1, 00:11:44.216 "num_base_bdevs_operational": 2, 00:11:44.216 "base_bdevs_list": [ 00:11:44.216 { 00:11:44.216 "name": "BaseBdev1", 00:11:44.216 "uuid": "65694e66-febd-4097-afbd-ddd867796d6d", 00:11:44.216 "is_configured": true, 00:11:44.216 "data_offset": 2048, 00:11:44.216 "data_size": 63488 00:11:44.216 }, 00:11:44.216 { 00:11:44.216 "name": "BaseBdev2", 00:11:44.216 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.216 "is_configured": false, 00:11:44.216 "data_offset": 0, 00:11:44.216 "data_size": 0 00:11:44.216 } 00:11:44.216 ] 00:11:44.216 }' 00:11:44.216 18:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.216 18:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:44.784 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:45.044 [2024-07-12 18:15:28.537312] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:45.044 [2024-07-12 18:15:28.537346] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfc350 name Existed_Raid, state configuring 00:11:45.044 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:45.303 [2024-07-12 18:15:28.781999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:45.303 [2024-07-12 18:15:28.783464] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:45.303 [2024-07-12 18:15:28.783494] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:45.303 18:15:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:45.303 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:45.303 "name": "Existed_Raid", 00:11:45.303 "uuid": "88eac858-56b6-41ac-9730-9dd563255d1e", 00:11:45.303 "strip_size_kb": 64, 00:11:45.303 "state": "configuring", 00:11:45.304 "raid_level": "concat", 00:11:45.304 "superblock": true, 00:11:45.304 "num_base_bdevs": 2, 00:11:45.304 "num_base_bdevs_discovered": 1, 00:11:45.304 "num_base_bdevs_operational": 2, 00:11:45.304 "base_bdevs_list": [ 00:11:45.304 { 00:11:45.304 "name": "BaseBdev1", 00:11:45.304 "uuid": "65694e66-febd-4097-afbd-ddd867796d6d", 00:11:45.304 "is_configured": true, 00:11:45.304 "data_offset": 2048, 00:11:45.304 "data_size": 63488 00:11:45.304 }, 00:11:45.304 { 00:11:45.304 "name": "BaseBdev2", 00:11:45.304 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:45.304 "is_configured": false, 00:11:45.304 "data_offset": 0, 00:11:45.304 "data_size": 0 00:11:45.304 } 00:11:45.304 ] 00:11:45.304 }' 00:11:45.304 18:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:45.304 18:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:45.873 18:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:46.132 [2024-07-12 18:15:29.671643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:46.132 [2024-07-12 18:15:29.671781] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cfd000 00:11:46.132 [2024-07-12 18:15:29.671794] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:46.132 [2024-07-12 18:15:29.671974] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c170c0 00:11:46.132 [2024-07-12 18:15:29.672091] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cfd000 00:11:46.132 [2024-07-12 18:15:29.672101] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cfd000 00:11:46.132 [2024-07-12 18:15:29.672190] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:46.132 BaseBdev2 00:11:46.132 18:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:46.132 18:15:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:46.132 18:15:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:46.132 18:15:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:46.132 18:15:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:46.132 18:15:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:46.132 18:15:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:46.390 18:15:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:46.390 [ 00:11:46.390 { 00:11:46.390 "name": "BaseBdev2", 00:11:46.391 "aliases": [ 00:11:46.391 "7d2ecd42-b158-4647-9900-c6c821d02562" 00:11:46.391 ], 00:11:46.391 "product_name": "Malloc disk", 00:11:46.391 "block_size": 512, 00:11:46.391 "num_blocks": 65536, 00:11:46.391 "uuid": "7d2ecd42-b158-4647-9900-c6c821d02562", 00:11:46.391 "assigned_rate_limits": { 00:11:46.391 "rw_ios_per_sec": 0, 00:11:46.391 "rw_mbytes_per_sec": 0, 00:11:46.391 "r_mbytes_per_sec": 0, 00:11:46.391 "w_mbytes_per_sec": 0 00:11:46.391 }, 00:11:46.391 "claimed": true, 00:11:46.391 "claim_type": "exclusive_write", 00:11:46.391 "zoned": false, 00:11:46.391 "supported_io_types": { 00:11:46.391 "read": true, 00:11:46.391 "write": true, 00:11:46.391 "unmap": true, 00:11:46.391 "flush": true, 00:11:46.391 "reset": true, 00:11:46.391 "nvme_admin": false, 00:11:46.391 "nvme_io": false, 00:11:46.391 "nvme_io_md": false, 00:11:46.391 "write_zeroes": true, 00:11:46.391 "zcopy": true, 00:11:46.391 "get_zone_info": false, 00:11:46.391 "zone_management": false, 00:11:46.391 "zone_append": false, 00:11:46.391 "compare": false, 00:11:46.391 "compare_and_write": false, 00:11:46.391 "abort": true, 00:11:46.391 "seek_hole": false, 00:11:46.391 "seek_data": false, 00:11:46.391 "copy": true, 00:11:46.391 "nvme_iov_md": false 00:11:46.391 }, 00:11:46.391 "memory_domains": [ 00:11:46.391 { 00:11:46.391 "dma_device_id": "system", 00:11:46.391 "dma_device_type": 1 00:11:46.391 }, 00:11:46.391 { 00:11:46.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.391 "dma_device_type": 2 00:11:46.391 } 00:11:46.391 ], 00:11:46.391 "driver_specific": {} 00:11:46.391 } 00:11:46.391 ] 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:46.650 "name": "Existed_Raid", 00:11:46.650 "uuid": "88eac858-56b6-41ac-9730-9dd563255d1e", 00:11:46.650 "strip_size_kb": 64, 00:11:46.650 "state": "online", 00:11:46.650 "raid_level": "concat", 00:11:46.650 "superblock": true, 00:11:46.650 
"num_base_bdevs": 2, 00:11:46.650 "num_base_bdevs_discovered": 2, 00:11:46.650 "num_base_bdevs_operational": 2, 00:11:46.650 "base_bdevs_list": [ 00:11:46.650 { 00:11:46.650 "name": "BaseBdev1", 00:11:46.650 "uuid": "65694e66-febd-4097-afbd-ddd867796d6d", 00:11:46.650 "is_configured": true, 00:11:46.650 "data_offset": 2048, 00:11:46.650 "data_size": 63488 00:11:46.650 }, 00:11:46.650 { 00:11:46.650 "name": "BaseBdev2", 00:11:46.650 "uuid": "7d2ecd42-b158-4647-9900-c6c821d02562", 00:11:46.650 "is_configured": true, 00:11:46.650 "data_offset": 2048, 00:11:46.650 "data_size": 63488 00:11:46.650 } 00:11:46.650 ] 00:11:46.650 }' 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:46.650 18:15:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:47.219 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:47.219 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:47.219 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:47.219 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:47.219 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:47.219 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:47.219 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:47.219 18:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:47.479 [2024-07-12 18:15:31.159847] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:47.479 18:15:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:47.479 "name": "Existed_Raid", 00:11:47.479 "aliases": [ 00:11:47.479 "88eac858-56b6-41ac-9730-9dd563255d1e" 00:11:47.479 ], 00:11:47.479 "product_name": "Raid Volume", 00:11:47.479 "block_size": 512, 00:11:47.479 "num_blocks": 126976, 00:11:47.479 "uuid": "88eac858-56b6-41ac-9730-9dd563255d1e", 00:11:47.479 "assigned_rate_limits": { 00:11:47.479 "rw_ios_per_sec": 0, 00:11:47.479 "rw_mbytes_per_sec": 0, 00:11:47.479 "r_mbytes_per_sec": 0, 00:11:47.479 "w_mbytes_per_sec": 0 00:11:47.479 }, 00:11:47.479 "claimed": false, 00:11:47.479 "zoned": false, 00:11:47.479 "supported_io_types": { 00:11:47.479 "read": true, 00:11:47.479 "write": true, 00:11:47.479 "unmap": true, 00:11:47.479 "flush": true, 00:11:47.479 "reset": true, 00:11:47.479 "nvme_admin": false, 00:11:47.479 "nvme_io": false, 00:11:47.479 "nvme_io_md": false, 00:11:47.479 "write_zeroes": true, 00:11:47.479 "zcopy": false, 00:11:47.479 "get_zone_info": false, 00:11:47.479 "zone_management": false, 00:11:47.479 "zone_append": false, 00:11:47.479 "compare": false, 00:11:47.479 "compare_and_write": false, 00:11:47.479 "abort": false, 00:11:47.479 "seek_hole": false, 00:11:47.479 "seek_data": false, 00:11:47.479 "copy": false, 00:11:47.479 "nvme_iov_md": false 00:11:47.479 }, 00:11:47.479 "memory_domains": [ 00:11:47.479 { 00:11:47.479 "dma_device_id": "system", 00:11:47.479 "dma_device_type": 1 00:11:47.479 }, 00:11:47.479 { 00:11:47.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.479 "dma_device_type": 2 00:11:47.479 }, 00:11:47.479 { 00:11:47.479 "dma_device_id": "system", 00:11:47.479 "dma_device_type": 1 00:11:47.479 }, 00:11:47.479 { 00:11:47.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.479 "dma_device_type": 2 00:11:47.479 } 00:11:47.479 ], 00:11:47.479 "driver_specific": { 00:11:47.479 "raid": { 00:11:47.479 "uuid": "88eac858-56b6-41ac-9730-9dd563255d1e", 00:11:47.479 "strip_size_kb": 64, 
00:11:47.479 "state": "online", 00:11:47.479 "raid_level": "concat", 00:11:47.479 "superblock": true, 00:11:47.480 "num_base_bdevs": 2, 00:11:47.480 "num_base_bdevs_discovered": 2, 00:11:47.480 "num_base_bdevs_operational": 2, 00:11:47.480 "base_bdevs_list": [ 00:11:47.480 { 00:11:47.480 "name": "BaseBdev1", 00:11:47.480 "uuid": "65694e66-febd-4097-afbd-ddd867796d6d", 00:11:47.480 "is_configured": true, 00:11:47.480 "data_offset": 2048, 00:11:47.480 "data_size": 63488 00:11:47.480 }, 00:11:47.480 { 00:11:47.480 "name": "BaseBdev2", 00:11:47.480 "uuid": "7d2ecd42-b158-4647-9900-c6c821d02562", 00:11:47.480 "is_configured": true, 00:11:47.480 "data_offset": 2048, 00:11:47.480 "data_size": 63488 00:11:47.480 } 00:11:47.480 ] 00:11:47.480 } 00:11:47.480 } 00:11:47.480 }' 00:11:47.480 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:47.739 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:47.739 BaseBdev2' 00:11:47.739 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:47.739 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:47.739 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:47.998 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:47.998 "name": "BaseBdev1", 00:11:47.998 "aliases": [ 00:11:47.998 "65694e66-febd-4097-afbd-ddd867796d6d" 00:11:47.998 ], 00:11:47.998 "product_name": "Malloc disk", 00:11:47.998 "block_size": 512, 00:11:47.998 "num_blocks": 65536, 00:11:47.998 "uuid": "65694e66-febd-4097-afbd-ddd867796d6d", 00:11:47.998 "assigned_rate_limits": { 00:11:47.998 "rw_ios_per_sec": 0, 
00:11:47.998 "rw_mbytes_per_sec": 0, 00:11:47.998 "r_mbytes_per_sec": 0, 00:11:47.998 "w_mbytes_per_sec": 0 00:11:47.998 }, 00:11:47.998 "claimed": true, 00:11:47.998 "claim_type": "exclusive_write", 00:11:47.998 "zoned": false, 00:11:47.998 "supported_io_types": { 00:11:47.998 "read": true, 00:11:47.998 "write": true, 00:11:47.998 "unmap": true, 00:11:47.998 "flush": true, 00:11:47.998 "reset": true, 00:11:47.998 "nvme_admin": false, 00:11:47.998 "nvme_io": false, 00:11:47.998 "nvme_io_md": false, 00:11:47.998 "write_zeroes": true, 00:11:47.998 "zcopy": true, 00:11:47.998 "get_zone_info": false, 00:11:47.998 "zone_management": false, 00:11:47.998 "zone_append": false, 00:11:47.998 "compare": false, 00:11:47.998 "compare_and_write": false, 00:11:47.998 "abort": true, 00:11:47.998 "seek_hole": false, 00:11:47.998 "seek_data": false, 00:11:47.999 "copy": true, 00:11:47.999 "nvme_iov_md": false 00:11:47.999 }, 00:11:47.999 "memory_domains": [ 00:11:47.999 { 00:11:47.999 "dma_device_id": "system", 00:11:47.999 "dma_device_type": 1 00:11:47.999 }, 00:11:47.999 { 00:11:47.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.999 "dma_device_type": 2 00:11:47.999 } 00:11:47.999 ], 00:11:47.999 "driver_specific": {} 00:11:47.999 }' 00:11:47.999 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:47.999 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:47.999 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:47.999 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:47.999 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:47.999 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:47.999 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:47.999 
18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:48.258 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:48.258 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:48.258 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:48.258 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:48.258 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:48.258 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:48.258 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:48.258 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:48.258 "name": "BaseBdev2", 00:11:48.258 "aliases": [ 00:11:48.258 "7d2ecd42-b158-4647-9900-c6c821d02562" 00:11:48.258 ], 00:11:48.258 "product_name": "Malloc disk", 00:11:48.258 "block_size": 512, 00:11:48.258 "num_blocks": 65536, 00:11:48.258 "uuid": "7d2ecd42-b158-4647-9900-c6c821d02562", 00:11:48.258 "assigned_rate_limits": { 00:11:48.258 "rw_ios_per_sec": 0, 00:11:48.258 "rw_mbytes_per_sec": 0, 00:11:48.258 "r_mbytes_per_sec": 0, 00:11:48.258 "w_mbytes_per_sec": 0 00:11:48.258 }, 00:11:48.258 "claimed": true, 00:11:48.258 "claim_type": "exclusive_write", 00:11:48.258 "zoned": false, 00:11:48.258 "supported_io_types": { 00:11:48.258 "read": true, 00:11:48.258 "write": true, 00:11:48.258 "unmap": true, 00:11:48.258 "flush": true, 00:11:48.258 "reset": true, 00:11:48.258 "nvme_admin": false, 00:11:48.258 "nvme_io": false, 00:11:48.258 "nvme_io_md": false, 00:11:48.258 "write_zeroes": true, 00:11:48.258 "zcopy": true, 
00:11:48.258 "get_zone_info": false, 00:11:48.258 "zone_management": false, 00:11:48.258 "zone_append": false, 00:11:48.258 "compare": false, 00:11:48.258 "compare_and_write": false, 00:11:48.258 "abort": true, 00:11:48.258 "seek_hole": false, 00:11:48.258 "seek_data": false, 00:11:48.258 "copy": true, 00:11:48.258 "nvme_iov_md": false 00:11:48.258 }, 00:11:48.258 "memory_domains": [ 00:11:48.258 { 00:11:48.258 "dma_device_id": "system", 00:11:48.258 "dma_device_type": 1 00:11:48.258 }, 00:11:48.258 { 00:11:48.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.258 "dma_device_type": 2 00:11:48.258 } 00:11:48.258 ], 00:11:48.258 "driver_specific": {} 00:11:48.258 }' 00:11:48.518 18:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:48.518 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:48.518 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:48.518 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:48.518 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:48.518 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:48.518 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:48.518 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:48.777 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:48.777 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:48.777 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:48.777 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:48.777 18:15:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:49.037 [2024-07-12 18:15:32.571384] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:49.037 [2024-07-12 18:15:32.571407] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:49.037 [2024-07-12 18:15:32.571446] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.037 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:49.326 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:49.326 "name": "Existed_Raid", 00:11:49.326 "uuid": "88eac858-56b6-41ac-9730-9dd563255d1e", 00:11:49.326 "strip_size_kb": 64, 00:11:49.326 "state": "offline", 00:11:49.326 "raid_level": "concat", 00:11:49.326 "superblock": true, 00:11:49.326 "num_base_bdevs": 2, 00:11:49.326 "num_base_bdevs_discovered": 1, 00:11:49.326 "num_base_bdevs_operational": 1, 00:11:49.326 "base_bdevs_list": [ 00:11:49.326 { 00:11:49.326 "name": null, 00:11:49.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:49.326 "is_configured": false, 00:11:49.326 "data_offset": 2048, 00:11:49.326 "data_size": 63488 00:11:49.326 }, 00:11:49.326 { 00:11:49.326 "name": "BaseBdev2", 00:11:49.326 "uuid": "7d2ecd42-b158-4647-9900-c6c821d02562", 00:11:49.326 "is_configured": true, 00:11:49.326 "data_offset": 2048, 00:11:49.326 "data_size": 63488 00:11:49.326 } 00:11:49.326 ] 00:11:49.326 }' 00:11:49.326 18:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:49.326 18:15:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.893 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:49.893 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:11:49.893 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:49.893 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.152 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:50.152 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:50.152 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:50.410 [2024-07-12 18:15:33.944228] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:50.410 [2024-07-12 18:15:33.944275] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfd000 name Existed_Raid, state offline 00:11:50.410 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:50.410 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:50.410 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.410 18:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
2465080 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2465080 ']' 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2465080 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2465080 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2465080' 00:11:50.669 killing process with pid 2465080 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2465080 00:11:50.669 [2024-07-12 18:15:34.272892] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:50.669 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2465080 00:11:50.669 [2024-07-12 18:15:34.273855] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:50.928 18:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:50.928 00:11:50.928 real 0m10.031s 00:11:50.928 user 0m17.768s 00:11:50.928 sys 0m1.887s 00:11:50.928 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:50.928 18:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:50.928 ************************************ 00:11:50.928 END TEST raid_state_function_test_sb 00:11:50.928 
************************************ 00:11:50.928 18:15:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:50.928 18:15:34 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:11:50.928 18:15:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:50.928 18:15:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:50.928 18:15:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:50.928 ************************************ 00:11:50.928 START TEST raid_superblock_test 00:11:50.928 ************************************ 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local raid_bdev_uuid 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2466644 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2466644 /var/tmp/spdk-raid.sock 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2466644 ']' 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:50.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:50.928 18:15:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.928 [2024-07-12 18:15:34.640995] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:11:50.928 [2024-07-12 18:15:34.641062] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2466644 ] 00:11:51.188 [2024-07-12 18:15:34.758132] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:51.188 [2024-07-12 18:15:34.859808] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:51.447 [2024-07-12 18:15:34.928056] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:51.447 [2024-07-12 18:15:34.928091] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:52.015 18:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:52.015 18:15:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:52.015 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:52.015 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:52.015 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:52.015 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:52.015 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:52.016 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:52.016 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:52.016 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:52.016 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:52.275 malloc1 00:11:52.275 18:15:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:52.534 [2024-07-12 18:15:36.043202] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:52.534 [2024-07-12 18:15:36.043249] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:52.534 [2024-07-12 18:15:36.043269] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc24570 00:11:52.534 [2024-07-12 18:15:36.043282] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:52.534 [2024-07-12 18:15:36.044974] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:52.534 [2024-07-12 18:15:36.045001] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:52.534 pt1 00:11:52.534 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:52.534 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:52.534 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:52.534 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:52.534 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:52.534 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:52.534 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:52.534 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:52.534 18:15:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:52.793 malloc2 00:11:52.793 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:53.052 [2024-07-12 18:15:36.542099] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:53.052 [2024-07-12 18:15:36.542145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:53.052 [2024-07-12 18:15:36.542162] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc25970 00:11:53.052 [2024-07-12 18:15:36.542175] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:53.052 [2024-07-12 18:15:36.543817] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:53.052 [2024-07-12 18:15:36.543845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:53.052 pt2 00:11:53.052 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:53.052 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:53.052 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:11:53.312 [2024-07-12 18:15:36.790781] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:53.312 [2024-07-12 18:15:36.792146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:53.312 [2024-07-12 18:15:36.792286] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdc8270 
00:11:53.312 [2024-07-12 18:15:36.792298] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:53.312 [2024-07-12 18:15:36.792499] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdbdc10 00:11:53.312 [2024-07-12 18:15:36.792643] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdc8270 00:11:53.312 [2024-07-12 18:15:36.792653] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdc8270 00:11:53.312 [2024-07-12 18:15:36.792754] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.312 18:15:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:53.570 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.570 "name": "raid_bdev1", 00:11:53.570 "uuid": "21f4b8a7-9531-40d3-810d-1e7ba573007d", 00:11:53.570 "strip_size_kb": 64, 00:11:53.570 "state": "online", 00:11:53.570 "raid_level": "concat", 00:11:53.570 "superblock": true, 00:11:53.570 "num_base_bdevs": 2, 00:11:53.570 "num_base_bdevs_discovered": 2, 00:11:53.570 "num_base_bdevs_operational": 2, 00:11:53.570 "base_bdevs_list": [ 00:11:53.570 { 00:11:53.570 "name": "pt1", 00:11:53.570 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:53.570 "is_configured": true, 00:11:53.570 "data_offset": 2048, 00:11:53.570 "data_size": 63488 00:11:53.570 }, 00:11:53.570 { 00:11:53.570 "name": "pt2", 00:11:53.570 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:53.570 "is_configured": true, 00:11:53.570 "data_offset": 2048, 00:11:53.570 "data_size": 63488 00:11:53.570 } 00:11:53.570 ] 00:11:53.570 }' 00:11:53.570 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.570 18:15:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:54.139 [2024-07-12 18:15:37.829726] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:54.139 "name": "raid_bdev1", 00:11:54.139 "aliases": [ 00:11:54.139 "21f4b8a7-9531-40d3-810d-1e7ba573007d" 00:11:54.139 ], 00:11:54.139 "product_name": "Raid Volume", 00:11:54.139 "block_size": 512, 00:11:54.139 "num_blocks": 126976, 00:11:54.139 "uuid": "21f4b8a7-9531-40d3-810d-1e7ba573007d", 00:11:54.139 "assigned_rate_limits": { 00:11:54.139 "rw_ios_per_sec": 0, 00:11:54.139 "rw_mbytes_per_sec": 0, 00:11:54.139 "r_mbytes_per_sec": 0, 00:11:54.139 "w_mbytes_per_sec": 0 00:11:54.139 }, 00:11:54.139 "claimed": false, 00:11:54.139 "zoned": false, 00:11:54.139 "supported_io_types": { 00:11:54.139 "read": true, 00:11:54.139 "write": true, 00:11:54.139 "unmap": true, 00:11:54.139 "flush": true, 00:11:54.139 "reset": true, 00:11:54.139 "nvme_admin": false, 00:11:54.139 "nvme_io": false, 00:11:54.139 "nvme_io_md": false, 00:11:54.139 "write_zeroes": true, 00:11:54.139 "zcopy": false, 00:11:54.139 "get_zone_info": false, 00:11:54.139 "zone_management": false, 00:11:54.139 "zone_append": false, 00:11:54.139 "compare": false, 00:11:54.139 "compare_and_write": false, 00:11:54.139 "abort": false, 00:11:54.139 "seek_hole": false, 00:11:54.139 "seek_data": false, 00:11:54.139 "copy": false, 00:11:54.139 "nvme_iov_md": false 00:11:54.139 }, 00:11:54.139 "memory_domains": [ 00:11:54.139 { 00:11:54.139 "dma_device_id": "system", 00:11:54.139 "dma_device_type": 1 00:11:54.139 }, 00:11:54.139 { 00:11:54.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.139 "dma_device_type": 2 00:11:54.139 }, 00:11:54.139 { 00:11:54.139 "dma_device_id": "system", 
00:11:54.139 "dma_device_type": 1 00:11:54.139 }, 00:11:54.139 { 00:11:54.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.139 "dma_device_type": 2 00:11:54.139 } 00:11:54.139 ], 00:11:54.139 "driver_specific": { 00:11:54.139 "raid": { 00:11:54.139 "uuid": "21f4b8a7-9531-40d3-810d-1e7ba573007d", 00:11:54.139 "strip_size_kb": 64, 00:11:54.139 "state": "online", 00:11:54.139 "raid_level": "concat", 00:11:54.139 "superblock": true, 00:11:54.139 "num_base_bdevs": 2, 00:11:54.139 "num_base_bdevs_discovered": 2, 00:11:54.139 "num_base_bdevs_operational": 2, 00:11:54.139 "base_bdevs_list": [ 00:11:54.139 { 00:11:54.139 "name": "pt1", 00:11:54.139 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:54.139 "is_configured": true, 00:11:54.139 "data_offset": 2048, 00:11:54.139 "data_size": 63488 00:11:54.139 }, 00:11:54.139 { 00:11:54.139 "name": "pt2", 00:11:54.139 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:54.139 "is_configured": true, 00:11:54.139 "data_offset": 2048, 00:11:54.139 "data_size": 63488 00:11:54.139 } 00:11:54.139 ] 00:11:54.139 } 00:11:54.139 } 00:11:54.139 }' 00:11:54.139 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:54.398 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:54.398 pt2' 00:11:54.398 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:54.398 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:54.398 18:15:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:54.657 "name": "pt1", 00:11:54.657 "aliases": [ 00:11:54.657 "00000000-0000-0000-0000-000000000001" 
00:11:54.657 ], 00:11:54.657 "product_name": "passthru", 00:11:54.657 "block_size": 512, 00:11:54.657 "num_blocks": 65536, 00:11:54.657 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:54.657 "assigned_rate_limits": { 00:11:54.657 "rw_ios_per_sec": 0, 00:11:54.657 "rw_mbytes_per_sec": 0, 00:11:54.657 "r_mbytes_per_sec": 0, 00:11:54.657 "w_mbytes_per_sec": 0 00:11:54.657 }, 00:11:54.657 "claimed": true, 00:11:54.657 "claim_type": "exclusive_write", 00:11:54.657 "zoned": false, 00:11:54.657 "supported_io_types": { 00:11:54.657 "read": true, 00:11:54.657 "write": true, 00:11:54.657 "unmap": true, 00:11:54.657 "flush": true, 00:11:54.657 "reset": true, 00:11:54.657 "nvme_admin": false, 00:11:54.657 "nvme_io": false, 00:11:54.657 "nvme_io_md": false, 00:11:54.657 "write_zeroes": true, 00:11:54.657 "zcopy": true, 00:11:54.657 "get_zone_info": false, 00:11:54.657 "zone_management": false, 00:11:54.657 "zone_append": false, 00:11:54.657 "compare": false, 00:11:54.657 "compare_and_write": false, 00:11:54.657 "abort": true, 00:11:54.657 "seek_hole": false, 00:11:54.657 "seek_data": false, 00:11:54.657 "copy": true, 00:11:54.657 "nvme_iov_md": false 00:11:54.657 }, 00:11:54.657 "memory_domains": [ 00:11:54.657 { 00:11:54.657 "dma_device_id": "system", 00:11:54.657 "dma_device_type": 1 00:11:54.657 }, 00:11:54.657 { 00:11:54.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:54.657 "dma_device_type": 2 00:11:54.657 } 00:11:54.657 ], 00:11:54.657 "driver_specific": { 00:11:54.657 "passthru": { 00:11:54.657 "name": "pt1", 00:11:54.657 "base_bdev_name": "malloc1" 00:11:54.657 } 00:11:54.657 } 00:11:54.657 }' 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:54.657 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.916 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:54.916 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:54.916 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:54.916 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:54.916 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:55.175 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:55.175 "name": "pt2", 00:11:55.175 "aliases": [ 00:11:55.175 "00000000-0000-0000-0000-000000000002" 00:11:55.175 ], 00:11:55.175 "product_name": "passthru", 00:11:55.175 "block_size": 512, 00:11:55.175 "num_blocks": 65536, 00:11:55.175 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:55.175 "assigned_rate_limits": { 00:11:55.175 "rw_ios_per_sec": 0, 00:11:55.175 "rw_mbytes_per_sec": 0, 00:11:55.175 "r_mbytes_per_sec": 0, 00:11:55.175 "w_mbytes_per_sec": 0 00:11:55.175 }, 00:11:55.175 "claimed": true, 00:11:55.175 "claim_type": "exclusive_write", 00:11:55.175 "zoned": false, 00:11:55.175 "supported_io_types": { 00:11:55.175 "read": true, 
00:11:55.175 "write": true, 00:11:55.175 "unmap": true, 00:11:55.175 "flush": true, 00:11:55.175 "reset": true, 00:11:55.175 "nvme_admin": false, 00:11:55.175 "nvme_io": false, 00:11:55.175 "nvme_io_md": false, 00:11:55.175 "write_zeroes": true, 00:11:55.175 "zcopy": true, 00:11:55.175 "get_zone_info": false, 00:11:55.175 "zone_management": false, 00:11:55.175 "zone_append": false, 00:11:55.175 "compare": false, 00:11:55.175 "compare_and_write": false, 00:11:55.175 "abort": true, 00:11:55.175 "seek_hole": false, 00:11:55.175 "seek_data": false, 00:11:55.175 "copy": true, 00:11:55.175 "nvme_iov_md": false 00:11:55.175 }, 00:11:55.175 "memory_domains": [ 00:11:55.175 { 00:11:55.175 "dma_device_id": "system", 00:11:55.175 "dma_device_type": 1 00:11:55.175 }, 00:11:55.175 { 00:11:55.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.175 "dma_device_type": 2 00:11:55.175 } 00:11:55.175 ], 00:11:55.175 "driver_specific": { 00:11:55.175 "passthru": { 00:11:55.175 "name": "pt2", 00:11:55.175 "base_bdev_name": "malloc2" 00:11:55.175 } 00:11:55.175 } 00:11:55.175 }' 00:11:55.175 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.175 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:55.175 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:55.175 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.175 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:55.175 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:55.175 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:55.175 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:55.433 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:55.433 18:15:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:55.433 18:15:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:55.433 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:55.433 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:55.433 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:55.691 [2024-07-12 18:15:39.229412] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:55.691 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=21f4b8a7-9531-40d3-810d-1e7ba573007d 00:11:55.691 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 21f4b8a7-9531-40d3-810d-1e7ba573007d ']' 00:11:55.691 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:55.949 [2024-07-12 18:15:39.477838] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:55.949 [2024-07-12 18:15:39.477860] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:55.949 [2024-07-12 18:15:39.477908] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:55.949 [2024-07-12 18:15:39.477959] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:55.949 [2024-07-12 18:15:39.477972] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdc8270 name raid_bdev1, state offline 00:11:55.949 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.949 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:56.207 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:56.207 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:56.207 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:56.207 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:56.467 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:56.467 18:15:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:56.725 18:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:56.725 18:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:56.982 18:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:56.982 18:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n 
raid_bdev1 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:56.983 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:11:56.983 [2024-07-12 18:15:40.701031] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:56.983 [2024-07-12 18:15:40.702419] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:56.983 [2024-07-12 18:15:40.702473] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:56.983 [2024-07-12 18:15:40.702512] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on 
bdev malloc2 00:11:56.983 [2024-07-12 18:15:40.702531] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:56.983 [2024-07-12 18:15:40.702540] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdc7ff0 name raid_bdev1, state configuring 00:11:56.983 request: 00:11:56.983 { 00:11:56.983 "name": "raid_bdev1", 00:11:56.983 "raid_level": "concat", 00:11:56.983 "base_bdevs": [ 00:11:56.983 "malloc1", 00:11:56.983 "malloc2" 00:11:56.983 ], 00:11:56.983 "strip_size_kb": 64, 00:11:56.983 "superblock": false, 00:11:56.983 "method": "bdev_raid_create", 00:11:56.983 "req_id": 1 00:11:56.983 } 00:11:56.983 Got JSON-RPC error response 00:11:56.983 response: 00:11:56.983 { 00:11:56.983 "code": -17, 00:11:56.983 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:56.983 } 00:11:57.240 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:57.240 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:57.240 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:57.240 18:15:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:57.240 18:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.240 18:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:57.240 18:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:57.240 18:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:57.240 18:15:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:57.499 [2024-07-12 
18:15:41.186246] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:57.499 [2024-07-12 18:15:41.186287] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:57.499 [2024-07-12 18:15:41.186307] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc247a0 00:11:57.499 [2024-07-12 18:15:41.186320] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:57.499 [2024-07-12 18:15:41.187934] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:57.499 [2024-07-12 18:15:41.187962] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:57.499 [2024-07-12 18:15:41.188022] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:57.499 [2024-07-12 18:15:41.188046] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:57.499 pt1 00:11:57.499 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:11:57.499 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:57.499 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:57.499 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:57.499 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:57.499 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:57.499 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.499 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.499 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.499 18:15:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.500 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.500 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:57.758 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.758 "name": "raid_bdev1", 00:11:57.758 "uuid": "21f4b8a7-9531-40d3-810d-1e7ba573007d", 00:11:57.758 "strip_size_kb": 64, 00:11:57.758 "state": "configuring", 00:11:57.758 "raid_level": "concat", 00:11:57.758 "superblock": true, 00:11:57.758 "num_base_bdevs": 2, 00:11:57.758 "num_base_bdevs_discovered": 1, 00:11:57.758 "num_base_bdevs_operational": 2, 00:11:57.758 "base_bdevs_list": [ 00:11:57.758 { 00:11:57.758 "name": "pt1", 00:11:57.758 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:57.758 "is_configured": true, 00:11:57.758 "data_offset": 2048, 00:11:57.758 "data_size": 63488 00:11:57.758 }, 00:11:57.758 { 00:11:57.758 "name": null, 00:11:57.758 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:57.758 "is_configured": false, 00:11:57.758 "data_offset": 2048, 00:11:57.758 "data_size": 63488 00:11:57.758 } 00:11:57.758 ] 00:11:57.758 }' 00:11:57.758 18:15:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.758 18:15:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.323 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:58.323 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:58.323 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:58.323 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:58.581 [2024-07-12 18:15:42.257098] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:58.581 [2024-07-12 18:15:42.257143] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:58.581 [2024-07-12 18:15:42.257161] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdbe820 00:11:58.581 [2024-07-12 18:15:42.257173] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:58.581 [2024-07-12 18:15:42.257496] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:58.581 [2024-07-12 18:15:42.257513] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:58.581 [2024-07-12 18:15:42.257569] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:58.581 [2024-07-12 18:15:42.257587] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:58.581 [2024-07-12 18:15:42.257677] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc1aec0 00:11:58.581 [2024-07-12 18:15:42.257687] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:58.581 [2024-07-12 18:15:42.257852] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc1bf00 00:11:58.581 [2024-07-12 18:15:42.257991] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc1aec0 00:11:58.581 [2024-07-12 18:15:42.258002] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc1aec0 00:11:58.581 [2024-07-12 18:15:42.258100] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:58.581 pt2 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.581 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:58.839 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:58.839 "name": "raid_bdev1", 00:11:58.839 "uuid": "21f4b8a7-9531-40d3-810d-1e7ba573007d", 00:11:58.839 "strip_size_kb": 64, 00:11:58.839 "state": "online", 00:11:58.839 "raid_level": "concat", 00:11:58.839 "superblock": true, 00:11:58.839 "num_base_bdevs": 2, 00:11:58.839 "num_base_bdevs_discovered": 2, 00:11:58.839 "num_base_bdevs_operational": 2, 
00:11:58.839 "base_bdevs_list": [ 00:11:58.839 { 00:11:58.839 "name": "pt1", 00:11:58.839 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:58.839 "is_configured": true, 00:11:58.839 "data_offset": 2048, 00:11:58.839 "data_size": 63488 00:11:58.839 }, 00:11:58.839 { 00:11:58.839 "name": "pt2", 00:11:58.839 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:58.839 "is_configured": true, 00:11:58.839 "data_offset": 2048, 00:11:58.839 "data_size": 63488 00:11:58.839 } 00:11:58.839 ] 00:11:58.839 }' 00:11:58.839 18:15:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.839 18:15:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:59.406 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:59.406 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:59.406 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:59.406 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:59.406 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:59.406 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:59.406 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:59.406 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:59.664 [2024-07-12 18:15:43.340259] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:59.664 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:59.664 "name": "raid_bdev1", 00:11:59.664 "aliases": [ 00:11:59.664 "21f4b8a7-9531-40d3-810d-1e7ba573007d" 00:11:59.664 ], 
00:11:59.664 "product_name": "Raid Volume", 00:11:59.664 "block_size": 512, 00:11:59.664 "num_blocks": 126976, 00:11:59.664 "uuid": "21f4b8a7-9531-40d3-810d-1e7ba573007d", 00:11:59.664 "assigned_rate_limits": { 00:11:59.664 "rw_ios_per_sec": 0, 00:11:59.664 "rw_mbytes_per_sec": 0, 00:11:59.664 "r_mbytes_per_sec": 0, 00:11:59.664 "w_mbytes_per_sec": 0 00:11:59.664 }, 00:11:59.664 "claimed": false, 00:11:59.664 "zoned": false, 00:11:59.664 "supported_io_types": { 00:11:59.664 "read": true, 00:11:59.664 "write": true, 00:11:59.664 "unmap": true, 00:11:59.664 "flush": true, 00:11:59.664 "reset": true, 00:11:59.664 "nvme_admin": false, 00:11:59.664 "nvme_io": false, 00:11:59.664 "nvme_io_md": false, 00:11:59.664 "write_zeroes": true, 00:11:59.664 "zcopy": false, 00:11:59.664 "get_zone_info": false, 00:11:59.664 "zone_management": false, 00:11:59.664 "zone_append": false, 00:11:59.664 "compare": false, 00:11:59.664 "compare_and_write": false, 00:11:59.664 "abort": false, 00:11:59.664 "seek_hole": false, 00:11:59.664 "seek_data": false, 00:11:59.664 "copy": false, 00:11:59.664 "nvme_iov_md": false 00:11:59.664 }, 00:11:59.664 "memory_domains": [ 00:11:59.664 { 00:11:59.664 "dma_device_id": "system", 00:11:59.664 "dma_device_type": 1 00:11:59.664 }, 00:11:59.664 { 00:11:59.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.664 "dma_device_type": 2 00:11:59.664 }, 00:11:59.664 { 00:11:59.664 "dma_device_id": "system", 00:11:59.664 "dma_device_type": 1 00:11:59.664 }, 00:11:59.664 { 00:11:59.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.664 "dma_device_type": 2 00:11:59.664 } 00:11:59.664 ], 00:11:59.664 "driver_specific": { 00:11:59.664 "raid": { 00:11:59.664 "uuid": "21f4b8a7-9531-40d3-810d-1e7ba573007d", 00:11:59.664 "strip_size_kb": 64, 00:11:59.664 "state": "online", 00:11:59.664 "raid_level": "concat", 00:11:59.664 "superblock": true, 00:11:59.664 "num_base_bdevs": 2, 00:11:59.664 "num_base_bdevs_discovered": 2, 00:11:59.664 "num_base_bdevs_operational": 
2, 00:11:59.664 "base_bdevs_list": [ 00:11:59.664 { 00:11:59.664 "name": "pt1", 00:11:59.664 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:59.664 "is_configured": true, 00:11:59.664 "data_offset": 2048, 00:11:59.664 "data_size": 63488 00:11:59.664 }, 00:11:59.664 { 00:11:59.664 "name": "pt2", 00:11:59.664 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:59.664 "is_configured": true, 00:11:59.664 "data_offset": 2048, 00:11:59.664 "data_size": 63488 00:11:59.664 } 00:11:59.664 ] 00:11:59.664 } 00:11:59.664 } 00:11:59.664 }' 00:11:59.664 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:59.923 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:59.923 pt2' 00:11:59.923 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:59.923 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:59.923 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:00.181 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:00.181 "name": "pt1", 00:12:00.181 "aliases": [ 00:12:00.181 "00000000-0000-0000-0000-000000000001" 00:12:00.181 ], 00:12:00.181 "product_name": "passthru", 00:12:00.181 "block_size": 512, 00:12:00.181 "num_blocks": 65536, 00:12:00.181 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:00.181 "assigned_rate_limits": { 00:12:00.181 "rw_ios_per_sec": 0, 00:12:00.181 "rw_mbytes_per_sec": 0, 00:12:00.181 "r_mbytes_per_sec": 0, 00:12:00.181 "w_mbytes_per_sec": 0 00:12:00.181 }, 00:12:00.181 "claimed": true, 00:12:00.181 "claim_type": "exclusive_write", 00:12:00.181 "zoned": false, 00:12:00.181 "supported_io_types": { 00:12:00.181 "read": true, 
00:12:00.181 "write": true, 00:12:00.181 "unmap": true, 00:12:00.181 "flush": true, 00:12:00.181 "reset": true, 00:12:00.181 "nvme_admin": false, 00:12:00.181 "nvme_io": false, 00:12:00.181 "nvme_io_md": false, 00:12:00.181 "write_zeroes": true, 00:12:00.181 "zcopy": true, 00:12:00.181 "get_zone_info": false, 00:12:00.181 "zone_management": false, 00:12:00.181 "zone_append": false, 00:12:00.181 "compare": false, 00:12:00.181 "compare_and_write": false, 00:12:00.181 "abort": true, 00:12:00.181 "seek_hole": false, 00:12:00.181 "seek_data": false, 00:12:00.181 "copy": true, 00:12:00.181 "nvme_iov_md": false 00:12:00.181 }, 00:12:00.181 "memory_domains": [ 00:12:00.181 { 00:12:00.181 "dma_device_id": "system", 00:12:00.181 "dma_device_type": 1 00:12:00.181 }, 00:12:00.181 { 00:12:00.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.181 "dma_device_type": 2 00:12:00.181 } 00:12:00.181 ], 00:12:00.181 "driver_specific": { 00:12:00.181 "passthru": { 00:12:00.181 "name": "pt1", 00:12:00.181 "base_bdev_name": "malloc1" 00:12:00.181 } 00:12:00.181 } 00:12:00.181 }' 00:12:00.181 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.181 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.181 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:00.181 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.181 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.181 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:00.182 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.182 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.182 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:00.182 18:15:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.440 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.440 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.440 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:00.440 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:00.440 18:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:00.698 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:00.698 "name": "pt2", 00:12:00.698 "aliases": [ 00:12:00.698 "00000000-0000-0000-0000-000000000002" 00:12:00.698 ], 00:12:00.698 "product_name": "passthru", 00:12:00.698 "block_size": 512, 00:12:00.698 "num_blocks": 65536, 00:12:00.698 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:00.698 "assigned_rate_limits": { 00:12:00.698 "rw_ios_per_sec": 0, 00:12:00.698 "rw_mbytes_per_sec": 0, 00:12:00.698 "r_mbytes_per_sec": 0, 00:12:00.698 "w_mbytes_per_sec": 0 00:12:00.698 }, 00:12:00.698 "claimed": true, 00:12:00.698 "claim_type": "exclusive_write", 00:12:00.698 "zoned": false, 00:12:00.698 "supported_io_types": { 00:12:00.698 "read": true, 00:12:00.698 "write": true, 00:12:00.698 "unmap": true, 00:12:00.698 "flush": true, 00:12:00.698 "reset": true, 00:12:00.698 "nvme_admin": false, 00:12:00.698 "nvme_io": false, 00:12:00.698 "nvme_io_md": false, 00:12:00.698 "write_zeroes": true, 00:12:00.698 "zcopy": true, 00:12:00.698 "get_zone_info": false, 00:12:00.698 "zone_management": false, 00:12:00.698 "zone_append": false, 00:12:00.698 "compare": false, 00:12:00.698 "compare_and_write": false, 00:12:00.698 "abort": true, 00:12:00.698 "seek_hole": false, 00:12:00.698 "seek_data": false, 00:12:00.698 "copy": 
true, 00:12:00.698 "nvme_iov_md": false 00:12:00.698 }, 00:12:00.698 "memory_domains": [ 00:12:00.698 { 00:12:00.698 "dma_device_id": "system", 00:12:00.698 "dma_device_type": 1 00:12:00.698 }, 00:12:00.698 { 00:12:00.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.698 "dma_device_type": 2 00:12:00.698 } 00:12:00.698 ], 00:12:00.698 "driver_specific": { 00:12:00.698 "passthru": { 00:12:00.698 "name": "pt2", 00:12:00.698 "base_bdev_name": "malloc2" 00:12:00.698 } 00:12:00.698 } 00:12:00.698 }' 00:12:00.698 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.698 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.698 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:00.698 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.698 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.698 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:00.698 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.698 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.957 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:00.957 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.957 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.957 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.957 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:00.957 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:12:01.216 [2024-07-12 18:15:44.735960] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 21f4b8a7-9531-40d3-810d-1e7ba573007d '!=' 21f4b8a7-9531-40d3-810d-1e7ba573007d ']' 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2466644 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2466644 ']' 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2466644 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2466644 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2466644' 00:12:01.216 killing process with pid 2466644 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2466644 00:12:01.216 [2024-07-12 18:15:44.794577] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:01.216 [2024-07-12 18:15:44.794627] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:01.216 [2024-07-12 
18:15:44.794667] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:01.216 [2024-07-12 18:15:44.794678] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc1aec0 name raid_bdev1, state offline 00:12:01.216 18:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2466644 00:12:01.216 [2024-07-12 18:15:44.813824] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:01.474 18:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:01.474 00:12:01.474 real 0m10.460s 00:12:01.474 user 0m18.661s 00:12:01.474 sys 0m1.919s 00:12:01.474 18:15:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:01.474 18:15:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.474 ************************************ 00:12:01.474 END TEST raid_superblock_test 00:12:01.474 ************************************ 00:12:01.474 18:15:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:01.474 18:15:45 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:01.474 18:15:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:01.474 18:15:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:01.474 18:15:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:01.474 ************************************ 00:12:01.474 START TEST raid_read_error_test 00:12:01.474 ************************************ 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:01.474 18:15:45 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.qVrX8h8IZe 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2468269 00:12:01.474 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2468269 /var/tmp/spdk-raid.sock 00:12:01.475 18:15:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:01.475 18:15:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2468269 ']' 00:12:01.475 18:15:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:01.475 18:15:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:01.475 18:15:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:01.475 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:01.475 18:15:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:01.475 18:15:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.475 [2024-07-12 18:15:45.185955] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:12:01.475 [2024-07-12 18:15:45.186018] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2468269 ] 00:12:01.734 [2024-07-12 18:15:45.307027] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:01.734 [2024-07-12 18:15:45.413015] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:01.992 [2024-07-12 18:15:45.487027] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:01.992 [2024-07-12 18:15:45.487064] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:02.559 18:15:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:02.559 18:15:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:02.559 18:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:02.559 18:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:02.818 BaseBdev1_malloc 00:12:02.818 18:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:03.077 true 00:12:03.077 18:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:03.335 [2024-07-12 18:15:46.807827] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:03.335 [2024-07-12 18:15:46.807873] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:03.335 [2024-07-12 18:15:46.807893] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11c90d0 00:12:03.335 [2024-07-12 18:15:46.807906] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:03.335 [2024-07-12 18:15:46.809782] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:03.335 [2024-07-12 18:15:46.809812] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:03.335 BaseBdev1 00:12:03.335 18:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:03.335 18:15:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:03.335 BaseBdev2_malloc 00:12:03.594 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:03.594 true 00:12:03.594 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:03.853 [2024-07-12 18:15:47.522575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:03.853 [2024-07-12 18:15:47.522622] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:03.853 [2024-07-12 18:15:47.522643] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11cd910 00:12:03.853 [2024-07-12 18:15:47.522656] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:03.853 [2024-07-12 18:15:47.524247] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:03.853 [2024-07-12 18:15:47.524276] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:03.853 BaseBdev2 00:12:03.853 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:04.112 [2024-07-12 18:15:47.763235] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:04.112 [2024-07-12 18:15:47.764600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:04.112 [2024-07-12 18:15:47.764788] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11cf320 00:12:04.112 [2024-07-12 18:15:47.764802] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:04.112 [2024-07-12 18:15:47.765006] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11d0290 00:12:04.112 [2024-07-12 18:15:47.765149] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11cf320 00:12:04.112 [2024-07-12 18:15:47.765159] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11cf320 00:12:04.112 [2024-07-12 18:15:47.765262] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:04.112 18:15:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.418 18:15:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:04.418 "name": "raid_bdev1", 00:12:04.418 "uuid": "0422a09e-ec2f-4966-a2e1-6c33b170a347", 00:12:04.418 "strip_size_kb": 64, 00:12:04.418 "state": "online", 00:12:04.418 "raid_level": "concat", 00:12:04.418 "superblock": true, 00:12:04.418 "num_base_bdevs": 2, 00:12:04.418 "num_base_bdevs_discovered": 2, 00:12:04.418 "num_base_bdevs_operational": 2, 00:12:04.418 "base_bdevs_list": [ 00:12:04.418 { 00:12:04.418 "name": "BaseBdev1", 00:12:04.418 "uuid": "79ea9c7f-98c3-5756-a43f-00bd27235884", 00:12:04.418 "is_configured": true, 00:12:04.418 "data_offset": 2048, 00:12:04.418 "data_size": 63488 00:12:04.418 }, 00:12:04.418 { 00:12:04.418 "name": "BaseBdev2", 00:12:04.418 "uuid": "0063a8a7-76ef-5437-a6de-bafafa72c426", 00:12:04.418 "is_configured": true, 00:12:04.418 "data_offset": 2048, 00:12:04.418 "data_size": 63488 00:12:04.418 } 00:12:04.418 ] 00:12:04.418 }' 00:12:04.418 18:15:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:04.418 18:15:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:04.985 18:15:48 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:04.985 18:15:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:04.985 [2024-07-12 18:15:48.697986] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11ca9b0 00:12:05.922 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.181 18:15:49 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.181 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:06.440 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.440 "name": "raid_bdev1", 00:12:06.440 "uuid": "0422a09e-ec2f-4966-a2e1-6c33b170a347", 00:12:06.440 "strip_size_kb": 64, 00:12:06.440 "state": "online", 00:12:06.440 "raid_level": "concat", 00:12:06.440 "superblock": true, 00:12:06.440 "num_base_bdevs": 2, 00:12:06.440 "num_base_bdevs_discovered": 2, 00:12:06.440 "num_base_bdevs_operational": 2, 00:12:06.440 "base_bdevs_list": [ 00:12:06.440 { 00:12:06.440 "name": "BaseBdev1", 00:12:06.440 "uuid": "79ea9c7f-98c3-5756-a43f-00bd27235884", 00:12:06.440 "is_configured": true, 00:12:06.440 "data_offset": 2048, 00:12:06.440 "data_size": 63488 00:12:06.440 }, 00:12:06.440 { 00:12:06.440 "name": "BaseBdev2", 00:12:06.440 "uuid": "0063a8a7-76ef-5437-a6de-bafafa72c426", 00:12:06.440 "is_configured": true, 00:12:06.440 "data_offset": 2048, 00:12:06.440 "data_size": 63488 00:12:06.440 } 00:12:06.440 ] 00:12:06.440 }' 00:12:06.440 18:15:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.440 18:15:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:07.008 [2024-07-12 18:15:50.656524] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:07.008 [2024-07-12 18:15:50.656568] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:12:07.008 [2024-07-12 18:15:50.659715] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:07.008 [2024-07-12 18:15:50.659745] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:07.008 [2024-07-12 18:15:50.659772] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:07.008 [2024-07-12 18:15:50.659783] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11cf320 name raid_bdev1, state offline 00:12:07.008 0 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2468269 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2468269 ']' 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2468269 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2468269 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2468269' 00:12:07.008 killing process with pid 2468269 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2468269 00:12:07.008 [2024-07-12 18:15:50.725519] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:07.008 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2468269 00:12:07.267 [2024-07-12 18:15:50.736082] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.qVrX8h8IZe 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.51 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.51 != \0\.\0\0 ]] 00:12:07.267 00:12:07.267 real 0m5.858s 00:12:07.267 user 0m9.012s 00:12:07.267 sys 0m1.035s 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:07.267 18:15:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.268 ************************************ 00:12:07.268 END TEST raid_read_error_test 00:12:07.268 ************************************ 00:12:07.527 18:15:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:07.527 18:15:51 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:12:07.527 18:15:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:07.527 18:15:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:07.527 18:15:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:07.527 ************************************ 00:12:07.527 START TEST raid_write_error_test 00:12:07.527 ************************************ 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:07.527 18:15:51 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.9o7z4rXR5s 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2469078 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2469078 /var/tmp/spdk-raid.sock 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2469078 ']' 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:07.527 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:07.527 18:15:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.527 [2024-07-12 18:15:51.135843] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:12:07.527 [2024-07-12 18:15:51.135915] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2469078 ] 00:12:07.786 [2024-07-12 18:15:51.265341] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:07.786 [2024-07-12 18:15:51.368396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:07.787 [2024-07-12 18:15:51.424775] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:07.787 [2024-07-12 18:15:51.424803] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:08.354 18:15:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:08.354 18:15:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:08.354 18:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:08.354 18:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:08.613 BaseBdev1_malloc 00:12:08.613 18:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:08.613 true 00:12:08.899 18:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:08.899 [2024-07-12 18:15:52.504348] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:08.899 [2024-07-12 18:15:52.504398] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:12:08.899 [2024-07-12 18:15:52.504417] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x208c0d0 00:12:08.899 [2024-07-12 18:15:52.504430] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:08.899 [2024-07-12 18:15:52.506148] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:08.899 [2024-07-12 18:15:52.506177] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:08.899 BaseBdev1 00:12:08.899 18:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:08.899 18:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:09.173 BaseBdev2_malloc 00:12:09.173 18:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:09.173 true 00:12:09.173 18:15:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:09.431 [2024-07-12 18:15:53.086612] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:09.431 [2024-07-12 18:15:53.086657] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:09.431 [2024-07-12 18:15:53.086677] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2090910 00:12:09.431 [2024-07-12 18:15:53.086690] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:09.431 [2024-07-12 18:15:53.088172] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:09.431 [2024-07-12 18:15:53.088200] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:09.431 BaseBdev2 00:12:09.431 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:09.689 [2024-07-12 18:15:53.331288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:09.689 [2024-07-12 18:15:53.332487] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:09.689 [2024-07-12 18:15:53.332660] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2092320 00:12:09.689 [2024-07-12 18:15:53.332673] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:09.689 [2024-07-12 18:15:53.332857] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2093290 00:12:09.689 [2024-07-12 18:15:53.333005] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2092320 00:12:09.689 [2024-07-12 18:15:53.333016] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2092320 00:12:09.689 [2024-07-12 18:15:53.333113] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.689 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:09.946 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.946 "name": "raid_bdev1", 00:12:09.946 "uuid": "7fa3e1b5-f466-495a-acae-7865c6f6ac3d", 00:12:09.946 "strip_size_kb": 64, 00:12:09.946 "state": "online", 00:12:09.946 "raid_level": "concat", 00:12:09.946 "superblock": true, 00:12:09.946 "num_base_bdevs": 2, 00:12:09.946 "num_base_bdevs_discovered": 2, 00:12:09.946 "num_base_bdevs_operational": 2, 00:12:09.946 "base_bdevs_list": [ 00:12:09.946 { 00:12:09.946 "name": "BaseBdev1", 00:12:09.946 "uuid": "ccf00071-7c94-5858-829e-62e83de6e51a", 00:12:09.946 "is_configured": true, 00:12:09.946 "data_offset": 2048, 00:12:09.946 "data_size": 63488 00:12:09.946 }, 00:12:09.946 { 00:12:09.946 "name": "BaseBdev2", 00:12:09.946 "uuid": "5291c686-9a34-5987-9e06-d867816fc99c", 00:12:09.946 "is_configured": true, 00:12:09.946 "data_offset": 2048, 00:12:09.946 "data_size": 63488 00:12:09.946 } 00:12:09.946 ] 00:12:09.946 }' 00:12:09.946 18:15:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.946 18:15:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.512 
18:15:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:10.512 18:15:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:10.512 [2024-07-12 18:15:54.157748] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x208d9b0 00:12:11.446 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:11.705 "name": "raid_bdev1", 00:12:11.705 "uuid": "7fa3e1b5-f466-495a-acae-7865c6f6ac3d", 00:12:11.705 "strip_size_kb": 64, 00:12:11.705 "state": "online", 00:12:11.705 "raid_level": "concat", 00:12:11.705 "superblock": true, 00:12:11.705 "num_base_bdevs": 2, 00:12:11.705 "num_base_bdevs_discovered": 2, 00:12:11.705 "num_base_bdevs_operational": 2, 00:12:11.705 "base_bdevs_list": [ 00:12:11.705 { 00:12:11.705 "name": "BaseBdev1", 00:12:11.705 "uuid": "ccf00071-7c94-5858-829e-62e83de6e51a", 00:12:11.705 "is_configured": true, 00:12:11.705 "data_offset": 2048, 00:12:11.705 "data_size": 63488 00:12:11.705 }, 00:12:11.705 { 00:12:11.705 "name": "BaseBdev2", 00:12:11.705 "uuid": "5291c686-9a34-5987-9e06-d867816fc99c", 00:12:11.705 "is_configured": true, 00:12:11.705 "data_offset": 2048, 00:12:11.705 "data_size": 63488 00:12:11.705 } 00:12:11.705 ] 00:12:11.705 }' 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:11.705 18:15:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.272 18:15:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:12.531 [2024-07-12 18:15:56.168346] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:12.531 [2024-07-12 18:15:56.168390] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:12:12.531 [2024-07-12 18:15:56.171545] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:12.531 [2024-07-12 18:15:56.171576] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:12.531 [2024-07-12 18:15:56.171603] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:12.531 [2024-07-12 18:15:56.171614] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2092320 name raid_bdev1, state offline 00:12:12.531 0 00:12:12.531 18:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2469078 00:12:12.531 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2469078 ']' 00:12:12.531 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2469078 00:12:12.531 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:12.532 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:12.532 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2469078 00:12:12.532 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:12.532 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:12.532 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2469078' 00:12:12.532 killing process with pid 2469078 00:12:12.532 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2469078 00:12:12.532 [2024-07-12 18:15:56.236717] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:12.532 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2469078 
00:12:12.532 [2024-07-12 18:15:56.247224] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.9o7z4rXR5s 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:12:12.791 00:12:12.791 real 0m5.429s 00:12:12.791 user 0m8.226s 00:12:12.791 sys 0m1.004s 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:12.791 18:15:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.791 ************************************ 00:12:12.791 END TEST raid_write_error_test 00:12:12.791 ************************************ 00:12:13.050 18:15:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:13.050 18:15:56 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:13.050 18:15:56 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:13.050 18:15:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:13.050 18:15:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:13.050 18:15:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:13.050 ************************************ 00:12:13.050 START TEST 
raid_state_function_test 00:12:13.050 ************************************ 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2469883 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2469883' 00:12:13.050 Process raid pid: 2469883 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2469883 /var/tmp/spdk-raid.sock 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2469883 ']' 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:13.050 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:13.050 18:15:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.050 [2024-07-12 18:15:56.673747] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:12:13.050 [2024-07-12 18:15:56.673892] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:13.308 [2024-07-12 18:15:56.871609] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:13.308 [2024-07-12 18:15:56.975084] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:13.566 [2024-07-12 18:15:57.038816] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:13.566 [2024-07-12 18:15:57.038846] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:14.133 18:15:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:14.134 [2024-07-12 18:15:57.786361] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:14.134 [2024-07-12 18:15:57.786403] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:14.134 [2024-07-12 18:15:57.786414] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:14.134 [2024-07-12 18:15:57.786426] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.134 18:15:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:14.392 18:15:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.392 "name": "Existed_Raid", 00:12:14.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.392 "strip_size_kb": 0, 00:12:14.392 "state": "configuring", 00:12:14.392 "raid_level": "raid1", 00:12:14.392 "superblock": false, 00:12:14.392 "num_base_bdevs": 2, 00:12:14.392 "num_base_bdevs_discovered": 0, 00:12:14.392 "num_base_bdevs_operational": 2, 
00:12:14.392 "base_bdevs_list": [ 00:12:14.392 { 00:12:14.392 "name": "BaseBdev1", 00:12:14.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.392 "is_configured": false, 00:12:14.392 "data_offset": 0, 00:12:14.392 "data_size": 0 00:12:14.392 }, 00:12:14.392 { 00:12:14.392 "name": "BaseBdev2", 00:12:14.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.392 "is_configured": false, 00:12:14.392 "data_offset": 0, 00:12:14.392 "data_size": 0 00:12:14.392 } 00:12:14.392 ] 00:12:14.392 }' 00:12:14.392 18:15:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.392 18:15:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:14.958 18:15:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:15.216 [2024-07-12 18:15:58.885153] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:15.216 [2024-07-12 18:15:58.885182] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1167a80 name Existed_Raid, state configuring 00:12:15.216 18:15:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:15.475 [2024-07-12 18:15:59.129809] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:15.475 [2024-07-12 18:15:59.129835] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:15.475 [2024-07-12 18:15:59.129845] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:15.475 [2024-07-12 18:15:59.129861] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:15.475 18:15:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:15.733 [2024-07-12 18:15:59.384364] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:15.733 BaseBdev1 00:12:15.733 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:15.733 18:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:15.733 18:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:15.733 18:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:15.733 18:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:15.733 18:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:15.733 18:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:15.992 18:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:16.251 [ 00:12:16.251 { 00:12:16.251 "name": "BaseBdev1", 00:12:16.251 "aliases": [ 00:12:16.251 "896e71f7-377e-4903-8ce5-e9c51a66f360" 00:12:16.251 ], 00:12:16.251 "product_name": "Malloc disk", 00:12:16.251 "block_size": 512, 00:12:16.251 "num_blocks": 65536, 00:12:16.251 "uuid": "896e71f7-377e-4903-8ce5-e9c51a66f360", 00:12:16.251 "assigned_rate_limits": { 00:12:16.251 "rw_ios_per_sec": 0, 00:12:16.251 "rw_mbytes_per_sec": 0, 00:12:16.251 "r_mbytes_per_sec": 0, 00:12:16.251 "w_mbytes_per_sec": 0 00:12:16.251 }, 00:12:16.251 "claimed": true, 
00:12:16.251 "claim_type": "exclusive_write", 00:12:16.251 "zoned": false, 00:12:16.251 "supported_io_types": { 00:12:16.251 "read": true, 00:12:16.251 "write": true, 00:12:16.251 "unmap": true, 00:12:16.251 "flush": true, 00:12:16.251 "reset": true, 00:12:16.251 "nvme_admin": false, 00:12:16.251 "nvme_io": false, 00:12:16.251 "nvme_io_md": false, 00:12:16.251 "write_zeroes": true, 00:12:16.251 "zcopy": true, 00:12:16.251 "get_zone_info": false, 00:12:16.251 "zone_management": false, 00:12:16.251 "zone_append": false, 00:12:16.251 "compare": false, 00:12:16.251 "compare_and_write": false, 00:12:16.251 "abort": true, 00:12:16.251 "seek_hole": false, 00:12:16.251 "seek_data": false, 00:12:16.251 "copy": true, 00:12:16.251 "nvme_iov_md": false 00:12:16.251 }, 00:12:16.251 "memory_domains": [ 00:12:16.251 { 00:12:16.251 "dma_device_id": "system", 00:12:16.251 "dma_device_type": 1 00:12:16.251 }, 00:12:16.251 { 00:12:16.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.251 "dma_device_type": 2 00:12:16.251 } 00:12:16.251 ], 00:12:16.251 "driver_specific": {} 00:12:16.251 } 00:12:16.251 ] 00:12:16.251 18:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:16.251 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:16.251 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:16.251 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:16.252 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:16.252 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:16.252 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:16.252 18:15:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:16.252 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:16.252 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:16.252 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:16.252 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.252 18:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:16.511 18:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:16.511 "name": "Existed_Raid", 00:12:16.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.511 "strip_size_kb": 0, 00:12:16.511 "state": "configuring", 00:12:16.511 "raid_level": "raid1", 00:12:16.511 "superblock": false, 00:12:16.511 "num_base_bdevs": 2, 00:12:16.511 "num_base_bdevs_discovered": 1, 00:12:16.511 "num_base_bdevs_operational": 2, 00:12:16.511 "base_bdevs_list": [ 00:12:16.511 { 00:12:16.511 "name": "BaseBdev1", 00:12:16.511 "uuid": "896e71f7-377e-4903-8ce5-e9c51a66f360", 00:12:16.511 "is_configured": true, 00:12:16.511 "data_offset": 0, 00:12:16.511 "data_size": 65536 00:12:16.511 }, 00:12:16.511 { 00:12:16.511 "name": "BaseBdev2", 00:12:16.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.511 "is_configured": false, 00:12:16.511 "data_offset": 0, 00:12:16.511 "data_size": 0 00:12:16.511 } 00:12:16.511 ] 00:12:16.511 }' 00:12:16.511 18:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:16.511 18:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.447 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:17.706 [2024-07-12 18:16:01.237269] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:17.706 [2024-07-12 18:16:01.237307] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1167350 name Existed_Raid, state configuring 00:12:17.706 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:17.965 [2024-07-12 18:16:01.481944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:17.965 [2024-07-12 18:16:01.483427] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:17.965 [2024-07-12 18:16:01.483458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:17.965 18:16:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.965 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:18.224 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.224 "name": "Existed_Raid", 00:12:18.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.224 "strip_size_kb": 0, 00:12:18.224 "state": "configuring", 00:12:18.224 "raid_level": "raid1", 00:12:18.224 "superblock": false, 00:12:18.224 "num_base_bdevs": 2, 00:12:18.224 "num_base_bdevs_discovered": 1, 00:12:18.224 "num_base_bdevs_operational": 2, 00:12:18.224 "base_bdevs_list": [ 00:12:18.224 { 00:12:18.224 "name": "BaseBdev1", 00:12:18.224 "uuid": "896e71f7-377e-4903-8ce5-e9c51a66f360", 00:12:18.224 "is_configured": true, 00:12:18.224 "data_offset": 0, 00:12:18.224 "data_size": 65536 00:12:18.224 }, 00:12:18.224 { 00:12:18.224 "name": "BaseBdev2", 00:12:18.224 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.224 "is_configured": false, 00:12:18.224 "data_offset": 0, 00:12:18.224 "data_size": 0 00:12:18.224 } 00:12:18.224 ] 00:12:18.224 }' 00:12:18.224 18:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.224 18:16:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.792 18:16:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:18.792 [2024-07-12 18:16:02.484210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:18.792 [2024-07-12 18:16:02.484253] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1168000 00:12:18.792 [2024-07-12 18:16:02.484262] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:18.792 [2024-07-12 18:16:02.484453] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10820c0 00:12:18.792 [2024-07-12 18:16:02.484571] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1168000 00:12:18.792 [2024-07-12 18:16:02.484581] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1168000 00:12:18.792 [2024-07-12 18:16:02.484745] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:18.792 BaseBdev2 00:12:18.792 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:18.792 18:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:18.792 18:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:18.792 18:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:18.792 18:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:18.792 18:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:18.792 18:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:19.051 18:16:02 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:19.308 [ 00:12:19.308 { 00:12:19.308 "name": "BaseBdev2", 00:12:19.308 "aliases": [ 00:12:19.308 "00fae8a4-0957-453d-884a-2b11e2d87317" 00:12:19.308 ], 00:12:19.309 "product_name": "Malloc disk", 00:12:19.309 "block_size": 512, 00:12:19.309 "num_blocks": 65536, 00:12:19.309 "uuid": "00fae8a4-0957-453d-884a-2b11e2d87317", 00:12:19.309 "assigned_rate_limits": { 00:12:19.309 "rw_ios_per_sec": 0, 00:12:19.309 "rw_mbytes_per_sec": 0, 00:12:19.309 "r_mbytes_per_sec": 0, 00:12:19.309 "w_mbytes_per_sec": 0 00:12:19.309 }, 00:12:19.309 "claimed": true, 00:12:19.309 "claim_type": "exclusive_write", 00:12:19.309 "zoned": false, 00:12:19.309 "supported_io_types": { 00:12:19.309 "read": true, 00:12:19.309 "write": true, 00:12:19.309 "unmap": true, 00:12:19.309 "flush": true, 00:12:19.309 "reset": true, 00:12:19.309 "nvme_admin": false, 00:12:19.309 "nvme_io": false, 00:12:19.309 "nvme_io_md": false, 00:12:19.309 "write_zeroes": true, 00:12:19.309 "zcopy": true, 00:12:19.309 "get_zone_info": false, 00:12:19.309 "zone_management": false, 00:12:19.309 "zone_append": false, 00:12:19.309 "compare": false, 00:12:19.309 "compare_and_write": false, 00:12:19.309 "abort": true, 00:12:19.309 "seek_hole": false, 00:12:19.309 "seek_data": false, 00:12:19.309 "copy": true, 00:12:19.309 "nvme_iov_md": false 00:12:19.309 }, 00:12:19.309 "memory_domains": [ 00:12:19.309 { 00:12:19.309 "dma_device_id": "system", 00:12:19.309 "dma_device_type": 1 00:12:19.309 }, 00:12:19.309 { 00:12:19.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.309 "dma_device_type": 2 00:12:19.309 } 00:12:19.309 ], 00:12:19.309 "driver_specific": {} 00:12:19.309 } 00:12:19.309 ] 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.309 18:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:19.566 18:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:19.566 "name": "Existed_Raid", 00:12:19.566 "uuid": "14b5b0c7-83c0-485f-9dd1-a62eec332073", 00:12:19.566 "strip_size_kb": 0, 00:12:19.566 "state": "online", 00:12:19.566 "raid_level": "raid1", 00:12:19.566 "superblock": false, 00:12:19.566 "num_base_bdevs": 
2, 00:12:19.566 "num_base_bdevs_discovered": 2, 00:12:19.566 "num_base_bdevs_operational": 2, 00:12:19.566 "base_bdevs_list": [ 00:12:19.566 { 00:12:19.566 "name": "BaseBdev1", 00:12:19.566 "uuid": "896e71f7-377e-4903-8ce5-e9c51a66f360", 00:12:19.566 "is_configured": true, 00:12:19.566 "data_offset": 0, 00:12:19.566 "data_size": 65536 00:12:19.566 }, 00:12:19.566 { 00:12:19.566 "name": "BaseBdev2", 00:12:19.566 "uuid": "00fae8a4-0957-453d-884a-2b11e2d87317", 00:12:19.566 "is_configured": true, 00:12:19.566 "data_offset": 0, 00:12:19.566 "data_size": 65536 00:12:19.566 } 00:12:19.566 ] 00:12:19.566 }' 00:12:19.566 18:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.566 18:16:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.503 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:20.503 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:20.503 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:20.503 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:20.503 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:20.503 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:20.503 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:20.503 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:20.762 [2024-07-12 18:16:04.349410] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:20.762 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:12:20.763 "name": "Existed_Raid", 00:12:20.763 "aliases": [ 00:12:20.763 "14b5b0c7-83c0-485f-9dd1-a62eec332073" 00:12:20.763 ], 00:12:20.763 "product_name": "Raid Volume", 00:12:20.763 "block_size": 512, 00:12:20.763 "num_blocks": 65536, 00:12:20.763 "uuid": "14b5b0c7-83c0-485f-9dd1-a62eec332073", 00:12:20.763 "assigned_rate_limits": { 00:12:20.763 "rw_ios_per_sec": 0, 00:12:20.763 "rw_mbytes_per_sec": 0, 00:12:20.763 "r_mbytes_per_sec": 0, 00:12:20.763 "w_mbytes_per_sec": 0 00:12:20.763 }, 00:12:20.763 "claimed": false, 00:12:20.763 "zoned": false, 00:12:20.763 "supported_io_types": { 00:12:20.763 "read": true, 00:12:20.763 "write": true, 00:12:20.763 "unmap": false, 00:12:20.763 "flush": false, 00:12:20.763 "reset": true, 00:12:20.763 "nvme_admin": false, 00:12:20.763 "nvme_io": false, 00:12:20.763 "nvme_io_md": false, 00:12:20.763 "write_zeroes": true, 00:12:20.763 "zcopy": false, 00:12:20.763 "get_zone_info": false, 00:12:20.763 "zone_management": false, 00:12:20.763 "zone_append": false, 00:12:20.763 "compare": false, 00:12:20.763 "compare_and_write": false, 00:12:20.763 "abort": false, 00:12:20.763 "seek_hole": false, 00:12:20.763 "seek_data": false, 00:12:20.763 "copy": false, 00:12:20.763 "nvme_iov_md": false 00:12:20.763 }, 00:12:20.763 "memory_domains": [ 00:12:20.763 { 00:12:20.763 "dma_device_id": "system", 00:12:20.763 "dma_device_type": 1 00:12:20.763 }, 00:12:20.763 { 00:12:20.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.763 "dma_device_type": 2 00:12:20.763 }, 00:12:20.763 { 00:12:20.763 "dma_device_id": "system", 00:12:20.763 "dma_device_type": 1 00:12:20.763 }, 00:12:20.763 { 00:12:20.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.763 "dma_device_type": 2 00:12:20.763 } 00:12:20.763 ], 00:12:20.763 "driver_specific": { 00:12:20.763 "raid": { 00:12:20.763 "uuid": "14b5b0c7-83c0-485f-9dd1-a62eec332073", 00:12:20.763 "strip_size_kb": 0, 00:12:20.763 "state": "online", 00:12:20.763 "raid_level": "raid1", 
00:12:20.763 "superblock": false, 00:12:20.763 "num_base_bdevs": 2, 00:12:20.763 "num_base_bdevs_discovered": 2, 00:12:20.763 "num_base_bdevs_operational": 2, 00:12:20.763 "base_bdevs_list": [ 00:12:20.763 { 00:12:20.763 "name": "BaseBdev1", 00:12:20.763 "uuid": "896e71f7-377e-4903-8ce5-e9c51a66f360", 00:12:20.763 "is_configured": true, 00:12:20.763 "data_offset": 0, 00:12:20.763 "data_size": 65536 00:12:20.763 }, 00:12:20.763 { 00:12:20.763 "name": "BaseBdev2", 00:12:20.763 "uuid": "00fae8a4-0957-453d-884a-2b11e2d87317", 00:12:20.763 "is_configured": true, 00:12:20.763 "data_offset": 0, 00:12:20.763 "data_size": 65536 00:12:20.763 } 00:12:20.763 ] 00:12:20.763 } 00:12:20.763 } 00:12:20.763 }' 00:12:20.763 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:20.763 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:20.763 BaseBdev2' 00:12:20.763 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:20.763 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:20.763 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:21.022 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:21.022 "name": "BaseBdev1", 00:12:21.022 "aliases": [ 00:12:21.022 "896e71f7-377e-4903-8ce5-e9c51a66f360" 00:12:21.022 ], 00:12:21.022 "product_name": "Malloc disk", 00:12:21.022 "block_size": 512, 00:12:21.022 "num_blocks": 65536, 00:12:21.022 "uuid": "896e71f7-377e-4903-8ce5-e9c51a66f360", 00:12:21.022 "assigned_rate_limits": { 00:12:21.022 "rw_ios_per_sec": 0, 00:12:21.022 "rw_mbytes_per_sec": 0, 00:12:21.022 "r_mbytes_per_sec": 0, 00:12:21.022 
"w_mbytes_per_sec": 0 00:12:21.022 }, 00:12:21.022 "claimed": true, 00:12:21.022 "claim_type": "exclusive_write", 00:12:21.022 "zoned": false, 00:12:21.022 "supported_io_types": { 00:12:21.022 "read": true, 00:12:21.022 "write": true, 00:12:21.022 "unmap": true, 00:12:21.022 "flush": true, 00:12:21.022 "reset": true, 00:12:21.022 "nvme_admin": false, 00:12:21.022 "nvme_io": false, 00:12:21.022 "nvme_io_md": false, 00:12:21.022 "write_zeroes": true, 00:12:21.022 "zcopy": true, 00:12:21.022 "get_zone_info": false, 00:12:21.022 "zone_management": false, 00:12:21.022 "zone_append": false, 00:12:21.022 "compare": false, 00:12:21.022 "compare_and_write": false, 00:12:21.022 "abort": true, 00:12:21.022 "seek_hole": false, 00:12:21.022 "seek_data": false, 00:12:21.022 "copy": true, 00:12:21.022 "nvme_iov_md": false 00:12:21.022 }, 00:12:21.022 "memory_domains": [ 00:12:21.022 { 00:12:21.022 "dma_device_id": "system", 00:12:21.022 "dma_device_type": 1 00:12:21.022 }, 00:12:21.022 { 00:12:21.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.022 "dma_device_type": 2 00:12:21.022 } 00:12:21.022 ], 00:12:21.022 "driver_specific": {} 00:12:21.022 }' 00:12:21.022 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.022 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.022 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:21.022 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.282 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.282 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:21.282 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.282 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.282 
18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:21.282 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.282 18:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.541 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:21.541 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:21.541 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:21.541 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:21.541 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:21.541 "name": "BaseBdev2", 00:12:21.541 "aliases": [ 00:12:21.541 "00fae8a4-0957-453d-884a-2b11e2d87317" 00:12:21.542 ], 00:12:21.542 "product_name": "Malloc disk", 00:12:21.542 "block_size": 512, 00:12:21.542 "num_blocks": 65536, 00:12:21.542 "uuid": "00fae8a4-0957-453d-884a-2b11e2d87317", 00:12:21.542 "assigned_rate_limits": { 00:12:21.542 "rw_ios_per_sec": 0, 00:12:21.542 "rw_mbytes_per_sec": 0, 00:12:21.542 "r_mbytes_per_sec": 0, 00:12:21.542 "w_mbytes_per_sec": 0 00:12:21.542 }, 00:12:21.542 "claimed": true, 00:12:21.542 "claim_type": "exclusive_write", 00:12:21.542 "zoned": false, 00:12:21.542 "supported_io_types": { 00:12:21.542 "read": true, 00:12:21.542 "write": true, 00:12:21.542 "unmap": true, 00:12:21.542 "flush": true, 00:12:21.542 "reset": true, 00:12:21.542 "nvme_admin": false, 00:12:21.542 "nvme_io": false, 00:12:21.542 "nvme_io_md": false, 00:12:21.542 "write_zeroes": true, 00:12:21.542 "zcopy": true, 00:12:21.542 "get_zone_info": false, 00:12:21.542 "zone_management": false, 00:12:21.542 "zone_append": false, 00:12:21.542 "compare": 
false, 00:12:21.542 "compare_and_write": false, 00:12:21.542 "abort": true, 00:12:21.542 "seek_hole": false, 00:12:21.542 "seek_data": false, 00:12:21.542 "copy": true, 00:12:21.542 "nvme_iov_md": false 00:12:21.542 }, 00:12:21.542 "memory_domains": [ 00:12:21.542 { 00:12:21.542 "dma_device_id": "system", 00:12:21.542 "dma_device_type": 1 00:12:21.542 }, 00:12:21.542 { 00:12:21.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.542 "dma_device_type": 2 00:12:21.542 } 00:12:21.542 ], 00:12:21.542 "driver_specific": {} 00:12:21.542 }' 00:12:21.542 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.542 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.801 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:21.801 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.801 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.801 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:21.801 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.801 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.801 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:21.801 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.801 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:22.062 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:22.062 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:22.062 
[2024-07-12 18:16:05.768965] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.322 18:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.322 18:16:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.322 18:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.322 "name": "Existed_Raid", 00:12:22.322 "uuid": "14b5b0c7-83c0-485f-9dd1-a62eec332073", 00:12:22.322 "strip_size_kb": 0, 00:12:22.322 "state": "online", 00:12:22.322 "raid_level": "raid1", 00:12:22.322 "superblock": false, 00:12:22.322 "num_base_bdevs": 2, 00:12:22.322 "num_base_bdevs_discovered": 1, 00:12:22.322 "num_base_bdevs_operational": 1, 00:12:22.322 "base_bdevs_list": [ 00:12:22.322 { 00:12:22.322 "name": null, 00:12:22.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.322 "is_configured": false, 00:12:22.322 "data_offset": 0, 00:12:22.322 "data_size": 65536 00:12:22.322 }, 00:12:22.322 { 00:12:22.322 "name": "BaseBdev2", 00:12:22.322 "uuid": "00fae8a4-0957-453d-884a-2b11e2d87317", 00:12:22.322 "is_configured": true, 00:12:22.322 "data_offset": 0, 00:12:22.322 "data_size": 65536 00:12:22.322 } 00:12:22.322 ] 00:12:22.322 }' 00:12:22.322 18:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.322 18:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.259 18:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:23.259 18:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:23.259 18:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.259 18:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:23.259 18:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:23.259 18:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:12:23.259 18:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:23.518 [2024-07-12 18:16:07.110388] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:23.518 [2024-07-12 18:16:07.110462] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:23.518 [2024-07-12 18:16:07.121297] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:23.518 [2024-07-12 18:16:07.121333] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:23.518 [2024-07-12 18:16:07.121344] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1168000 name Existed_Raid, state offline 00:12:23.518 18:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:23.518 18:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:23.518 18:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:23.518 18:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2469883 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2469883 ']' 00:12:23.777 18:16:07 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2469883 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2469883 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2469883' 00:12:23.777 killing process with pid 2469883 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2469883 00:12:23.777 [2024-07-12 18:16:07.453828] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:23.777 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2469883 00:12:23.777 [2024-07-12 18:16:07.454791] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:24.037 18:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:24.037 00:12:24.037 real 0m11.122s 00:12:24.037 user 0m19.785s 00:12:24.037 sys 0m2.065s 00:12:24.037 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:24.037 18:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.037 ************************************ 00:12:24.037 END TEST raid_state_function_test 00:12:24.037 ************************************ 00:12:24.037 18:16:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:24.037 18:16:07 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test 
raid_state_function_test_sb raid_state_function_test raid1 2 true 00:12:24.037 18:16:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:24.037 18:16:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:24.037 18:16:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:24.296 ************************************ 00:12:24.296 START TEST raid_state_function_test_sb 00:12:24.296 ************************************ 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2471613 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2471613' 00:12:24.296 Process raid pid: 2471613 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2471613 /var/tmp/spdk-raid.sock 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2471613 ']' 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:24.296 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:24.296 18:16:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:24.296 [2024-07-12 18:16:07.830302] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:12:24.296 [2024-07-12 18:16:07.830366] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:24.296 [2024-07-12 18:16:07.953624] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:24.556 [2024-07-12 18:16:08.054346] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:24.556 [2024-07-12 18:16:08.118495] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.556 [2024-07-12 18:16:08.118527] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:25.123 18:16:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:25.123 18:16:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:25.123 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:25.381 [2024-07-12 18:16:08.967910] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:12:25.381 [2024-07-12 18:16:08.967977] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:25.381 [2024-07-12 18:16:08.967988] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:25.381 [2024-07-12 18:16:08.968000] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.381 18:16:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:25.639 18:16:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:25.639 "name": "Existed_Raid", 00:12:25.639 "uuid": "cf03f603-1881-4d12-adc7-cd7a7ee204a5", 00:12:25.639 "strip_size_kb": 0, 00:12:25.639 "state": "configuring", 00:12:25.639 "raid_level": "raid1", 00:12:25.639 "superblock": true, 00:12:25.639 "num_base_bdevs": 2, 00:12:25.639 "num_base_bdevs_discovered": 0, 00:12:25.640 "num_base_bdevs_operational": 2, 00:12:25.640 "base_bdevs_list": [ 00:12:25.640 { 00:12:25.640 "name": "BaseBdev1", 00:12:25.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:25.640 "is_configured": false, 00:12:25.640 "data_offset": 0, 00:12:25.640 "data_size": 0 00:12:25.640 }, 00:12:25.640 { 00:12:25.640 "name": "BaseBdev2", 00:12:25.640 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:25.640 "is_configured": false, 00:12:25.640 "data_offset": 0, 00:12:25.640 "data_size": 0 00:12:25.640 } 00:12:25.640 ] 00:12:25.640 }' 00:12:25.640 18:16:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:25.640 18:16:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:26.207 18:16:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:26.464 [2024-07-12 18:16:09.966407] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:26.464 [2024-07-12 18:16:09.966436] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xda6a80 name Existed_Raid, state configuring 00:12:26.464 18:16:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:26.464 [2024-07-12 18:16:10.142914] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:26.464 
[2024-07-12 18:16:10.142958] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:26.464 [2024-07-12 18:16:10.142968] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:26.464 [2024-07-12 18:16:10.142980] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:26.464 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:26.723 [2024-07-12 18:16:10.397551] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:26.723 BaseBdev1 00:12:26.723 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:26.723 18:16:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:26.723 18:16:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:26.723 18:16:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:26.723 18:16:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:26.723 18:16:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:26.723 18:16:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:26.981 18:16:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:27.244 [ 00:12:27.245 { 00:12:27.245 "name": "BaseBdev1", 00:12:27.245 "aliases": [ 00:12:27.245 
"f0306fbb-485e-4d81-86c2-e6559022e61b" 00:12:27.245 ], 00:12:27.245 "product_name": "Malloc disk", 00:12:27.245 "block_size": 512, 00:12:27.245 "num_blocks": 65536, 00:12:27.245 "uuid": "f0306fbb-485e-4d81-86c2-e6559022e61b", 00:12:27.245 "assigned_rate_limits": { 00:12:27.245 "rw_ios_per_sec": 0, 00:12:27.245 "rw_mbytes_per_sec": 0, 00:12:27.245 "r_mbytes_per_sec": 0, 00:12:27.245 "w_mbytes_per_sec": 0 00:12:27.245 }, 00:12:27.245 "claimed": true, 00:12:27.245 "claim_type": "exclusive_write", 00:12:27.245 "zoned": false, 00:12:27.245 "supported_io_types": { 00:12:27.245 "read": true, 00:12:27.245 "write": true, 00:12:27.245 "unmap": true, 00:12:27.245 "flush": true, 00:12:27.245 "reset": true, 00:12:27.245 "nvme_admin": false, 00:12:27.245 "nvme_io": false, 00:12:27.245 "nvme_io_md": false, 00:12:27.245 "write_zeroes": true, 00:12:27.245 "zcopy": true, 00:12:27.245 "get_zone_info": false, 00:12:27.245 "zone_management": false, 00:12:27.245 "zone_append": false, 00:12:27.245 "compare": false, 00:12:27.245 "compare_and_write": false, 00:12:27.245 "abort": true, 00:12:27.245 "seek_hole": false, 00:12:27.245 "seek_data": false, 00:12:27.245 "copy": true, 00:12:27.245 "nvme_iov_md": false 00:12:27.245 }, 00:12:27.245 "memory_domains": [ 00:12:27.245 { 00:12:27.245 "dma_device_id": "system", 00:12:27.245 "dma_device_type": 1 00:12:27.245 }, 00:12:27.245 { 00:12:27.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.245 "dma_device_type": 2 00:12:27.245 } 00:12:27.245 ], 00:12:27.245 "driver_specific": {} 00:12:27.245 } 00:12:27.245 ] 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:27.245 18:16:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.508 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:27.508 "name": "Existed_Raid", 00:12:27.508 "uuid": "f1545155-6082-4bde-97b4-bdc51076ef00", 00:12:27.508 "strip_size_kb": 0, 00:12:27.508 "state": "configuring", 00:12:27.508 "raid_level": "raid1", 00:12:27.508 "superblock": true, 00:12:27.508 "num_base_bdevs": 2, 00:12:27.508 "num_base_bdevs_discovered": 1, 00:12:27.508 "num_base_bdevs_operational": 2, 00:12:27.508 "base_bdevs_list": [ 00:12:27.508 { 00:12:27.508 "name": "BaseBdev1", 00:12:27.508 "uuid": "f0306fbb-485e-4d81-86c2-e6559022e61b", 00:12:27.508 "is_configured": true, 00:12:27.508 "data_offset": 2048, 00:12:27.508 "data_size": 63488 00:12:27.508 }, 00:12:27.508 { 00:12:27.508 "name": "BaseBdev2", 00:12:27.508 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:27.508 "is_configured": false, 00:12:27.508 "data_offset": 0, 00:12:27.508 "data_size": 0 00:12:27.508 } 00:12:27.508 ] 00:12:27.508 }' 00:12:27.508 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:27.508 18:16:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:28.078 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:28.078 [2024-07-12 18:16:11.660881] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:28.078 [2024-07-12 18:16:11.660914] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xda6350 name Existed_Raid, state configuring 00:12:28.078 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:28.354 [2024-07-12 18:16:11.845415] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:28.354 [2024-07-12 18:16:11.846907] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:28.354 [2024-07-12 18:16:11.846947] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:28.354 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:28.354 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:28.354 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:28.354 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:28.354 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:28.354 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:28.354 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:28.354 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.354 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.355 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.355 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.355 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.355 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.355 18:16:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.626 18:16:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.626 "name": "Existed_Raid", 00:12:28.626 "uuid": "c9a0cfcc-dde8-4ab0-b4ff-31fb7ad5e244", 00:12:28.626 "strip_size_kb": 0, 00:12:28.626 "state": "configuring", 00:12:28.626 "raid_level": "raid1", 00:12:28.626 "superblock": true, 00:12:28.626 "num_base_bdevs": 2, 00:12:28.626 "num_base_bdevs_discovered": 1, 00:12:28.626 "num_base_bdevs_operational": 2, 00:12:28.626 "base_bdevs_list": [ 00:12:28.626 { 00:12:28.626 "name": "BaseBdev1", 00:12:28.626 "uuid": "f0306fbb-485e-4d81-86c2-e6559022e61b", 00:12:28.626 "is_configured": true, 00:12:28.626 "data_offset": 2048, 00:12:28.626 "data_size": 63488 00:12:28.626 }, 00:12:28.626 
{ 00:12:28.626 "name": "BaseBdev2", 00:12:28.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.626 "is_configured": false, 00:12:28.626 "data_offset": 0, 00:12:28.626 "data_size": 0 00:12:28.626 } 00:12:28.626 ] 00:12:28.626 }' 00:12:28.626 18:16:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.626 18:16:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:29.192 18:16:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:29.450 [2024-07-12 18:16:12.948852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:29.450 [2024-07-12 18:16:12.949009] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xda7000 00:12:29.450 [2024-07-12 18:16:12.949022] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:29.450 [2024-07-12 18:16:12.949194] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcc10c0 00:12:29.450 [2024-07-12 18:16:12.949315] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xda7000 00:12:29.450 [2024-07-12 18:16:12.949326] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xda7000 00:12:29.450 [2024-07-12 18:16:12.949417] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:29.450 BaseBdev2 00:12:29.450 18:16:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:29.450 18:16:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:29.450 18:16:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:29.450 18:16:12 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:12:29.450 18:16:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:29.450 18:16:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:29.450 18:16:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:29.708 18:16:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:29.708 [ 00:12:29.708 { 00:12:29.708 "name": "BaseBdev2", 00:12:29.708 "aliases": [ 00:12:29.708 "110beeac-935b-46bc-86d5-3c8f7f4e4fb6" 00:12:29.708 ], 00:12:29.708 "product_name": "Malloc disk", 00:12:29.708 "block_size": 512, 00:12:29.708 "num_blocks": 65536, 00:12:29.708 "uuid": "110beeac-935b-46bc-86d5-3c8f7f4e4fb6", 00:12:29.708 "assigned_rate_limits": { 00:12:29.708 "rw_ios_per_sec": 0, 00:12:29.708 "rw_mbytes_per_sec": 0, 00:12:29.708 "r_mbytes_per_sec": 0, 00:12:29.708 "w_mbytes_per_sec": 0 00:12:29.708 }, 00:12:29.708 "claimed": true, 00:12:29.708 "claim_type": "exclusive_write", 00:12:29.708 "zoned": false, 00:12:29.708 "supported_io_types": { 00:12:29.708 "read": true, 00:12:29.708 "write": true, 00:12:29.708 "unmap": true, 00:12:29.708 "flush": true, 00:12:29.708 "reset": true, 00:12:29.708 "nvme_admin": false, 00:12:29.708 "nvme_io": false, 00:12:29.708 "nvme_io_md": false, 00:12:29.708 "write_zeroes": true, 00:12:29.708 "zcopy": true, 00:12:29.708 "get_zone_info": false, 00:12:29.708 "zone_management": false, 00:12:29.708 "zone_append": false, 00:12:29.708 "compare": false, 00:12:29.708 "compare_and_write": false, 00:12:29.708 "abort": true, 00:12:29.708 "seek_hole": false, 00:12:29.708 "seek_data": false, 00:12:29.708 "copy": true, 00:12:29.708 
"nvme_iov_md": false 00:12:29.708 }, 00:12:29.708 "memory_domains": [ 00:12:29.708 { 00:12:29.708 "dma_device_id": "system", 00:12:29.708 "dma_device_type": 1 00:12:29.708 }, 00:12:29.708 { 00:12:29.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.708 "dma_device_type": 2 00:12:29.708 } 00:12:29.708 ], 00:12:29.708 "driver_specific": {} 00:12:29.708 } 00:12:29.708 ] 00:12:29.708 18:16:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:29.966 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:29.966 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:29.966 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:29.966 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.967 "name": "Existed_Raid", 00:12:29.967 "uuid": "c9a0cfcc-dde8-4ab0-b4ff-31fb7ad5e244", 00:12:29.967 "strip_size_kb": 0, 00:12:29.967 "state": "online", 00:12:29.967 "raid_level": "raid1", 00:12:29.967 "superblock": true, 00:12:29.967 "num_base_bdevs": 2, 00:12:29.967 "num_base_bdevs_discovered": 2, 00:12:29.967 "num_base_bdevs_operational": 2, 00:12:29.967 "base_bdevs_list": [ 00:12:29.967 { 00:12:29.967 "name": "BaseBdev1", 00:12:29.967 "uuid": "f0306fbb-485e-4d81-86c2-e6559022e61b", 00:12:29.967 "is_configured": true, 00:12:29.967 "data_offset": 2048, 00:12:29.967 "data_size": 63488 00:12:29.967 }, 00:12:29.967 { 00:12:29.967 "name": "BaseBdev2", 00:12:29.967 "uuid": "110beeac-935b-46bc-86d5-3c8f7f4e4fb6", 00:12:29.967 "is_configured": true, 00:12:29.967 "data_offset": 2048, 00:12:29.967 "data_size": 63488 00:12:29.967 } 00:12:29.967 ] 00:12:29.967 }' 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.967 18:16:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:30.904 [2024-07-12 18:16:14.497232] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:30.904 "name": "Existed_Raid", 00:12:30.904 "aliases": [ 00:12:30.904 "c9a0cfcc-dde8-4ab0-b4ff-31fb7ad5e244" 00:12:30.904 ], 00:12:30.904 "product_name": "Raid Volume", 00:12:30.904 "block_size": 512, 00:12:30.904 "num_blocks": 63488, 00:12:30.904 "uuid": "c9a0cfcc-dde8-4ab0-b4ff-31fb7ad5e244", 00:12:30.904 "assigned_rate_limits": { 00:12:30.904 "rw_ios_per_sec": 0, 00:12:30.904 "rw_mbytes_per_sec": 0, 00:12:30.904 "r_mbytes_per_sec": 0, 00:12:30.904 "w_mbytes_per_sec": 0 00:12:30.904 }, 00:12:30.904 "claimed": false, 00:12:30.904 "zoned": false, 00:12:30.904 "supported_io_types": { 00:12:30.904 "read": true, 00:12:30.904 "write": true, 00:12:30.904 "unmap": false, 00:12:30.904 "flush": false, 00:12:30.904 "reset": true, 00:12:30.904 "nvme_admin": false, 00:12:30.904 "nvme_io": false, 00:12:30.904 "nvme_io_md": false, 00:12:30.904 "write_zeroes": true, 00:12:30.904 "zcopy": false, 00:12:30.904 "get_zone_info": false, 00:12:30.904 "zone_management": false, 00:12:30.904 "zone_append": false, 00:12:30.904 "compare": false, 00:12:30.904 "compare_and_write": false, 00:12:30.904 "abort": false, 00:12:30.904 "seek_hole": false, 00:12:30.904 "seek_data": false, 00:12:30.904 "copy": false, 00:12:30.904 "nvme_iov_md": false 00:12:30.904 }, 00:12:30.904 "memory_domains": [ 00:12:30.904 { 
00:12:30.904 "dma_device_id": "system", 00:12:30.904 "dma_device_type": 1 00:12:30.904 }, 00:12:30.904 { 00:12:30.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.904 "dma_device_type": 2 00:12:30.904 }, 00:12:30.904 { 00:12:30.904 "dma_device_id": "system", 00:12:30.904 "dma_device_type": 1 00:12:30.904 }, 00:12:30.904 { 00:12:30.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.904 "dma_device_type": 2 00:12:30.904 } 00:12:30.904 ], 00:12:30.904 "driver_specific": { 00:12:30.904 "raid": { 00:12:30.904 "uuid": "c9a0cfcc-dde8-4ab0-b4ff-31fb7ad5e244", 00:12:30.904 "strip_size_kb": 0, 00:12:30.904 "state": "online", 00:12:30.904 "raid_level": "raid1", 00:12:30.904 "superblock": true, 00:12:30.904 "num_base_bdevs": 2, 00:12:30.904 "num_base_bdevs_discovered": 2, 00:12:30.904 "num_base_bdevs_operational": 2, 00:12:30.904 "base_bdevs_list": [ 00:12:30.904 { 00:12:30.904 "name": "BaseBdev1", 00:12:30.904 "uuid": "f0306fbb-485e-4d81-86c2-e6559022e61b", 00:12:30.904 "is_configured": true, 00:12:30.904 "data_offset": 2048, 00:12:30.904 "data_size": 63488 00:12:30.904 }, 00:12:30.904 { 00:12:30.904 "name": "BaseBdev2", 00:12:30.904 "uuid": "110beeac-935b-46bc-86d5-3c8f7f4e4fb6", 00:12:30.904 "is_configured": true, 00:12:30.904 "data_offset": 2048, 00:12:30.904 "data_size": 63488 00:12:30.904 } 00:12:30.904 ] 00:12:30.904 } 00:12:30.904 } 00:12:30.904 }' 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:30.904 BaseBdev2' 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 00:12:30.904 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:31.163 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:31.163 "name": "BaseBdev1", 00:12:31.163 "aliases": [ 00:12:31.163 "f0306fbb-485e-4d81-86c2-e6559022e61b" 00:12:31.163 ], 00:12:31.163 "product_name": "Malloc disk", 00:12:31.163 "block_size": 512, 00:12:31.163 "num_blocks": 65536, 00:12:31.163 "uuid": "f0306fbb-485e-4d81-86c2-e6559022e61b", 00:12:31.163 "assigned_rate_limits": { 00:12:31.163 "rw_ios_per_sec": 0, 00:12:31.163 "rw_mbytes_per_sec": 0, 00:12:31.163 "r_mbytes_per_sec": 0, 00:12:31.163 "w_mbytes_per_sec": 0 00:12:31.163 }, 00:12:31.163 "claimed": true, 00:12:31.163 "claim_type": "exclusive_write", 00:12:31.163 "zoned": false, 00:12:31.163 "supported_io_types": { 00:12:31.163 "read": true, 00:12:31.163 "write": true, 00:12:31.163 "unmap": true, 00:12:31.163 "flush": true, 00:12:31.163 "reset": true, 00:12:31.163 "nvme_admin": false, 00:12:31.163 "nvme_io": false, 00:12:31.163 "nvme_io_md": false, 00:12:31.163 "write_zeroes": true, 00:12:31.163 "zcopy": true, 00:12:31.163 "get_zone_info": false, 00:12:31.163 "zone_management": false, 00:12:31.163 "zone_append": false, 00:12:31.163 "compare": false, 00:12:31.163 "compare_and_write": false, 00:12:31.163 "abort": true, 00:12:31.163 "seek_hole": false, 00:12:31.163 "seek_data": false, 00:12:31.163 "copy": true, 00:12:31.163 "nvme_iov_md": false 00:12:31.163 }, 00:12:31.163 "memory_domains": [ 00:12:31.163 { 00:12:31.163 "dma_device_id": "system", 00:12:31.163 "dma_device_type": 1 00:12:31.163 }, 00:12:31.163 { 00:12:31.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.163 "dma_device_type": 2 00:12:31.163 } 00:12:31.163 ], 00:12:31.163 "driver_specific": {} 00:12:31.163 }' 00:12:31.163 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.163 18:16:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.422 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:31.422 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.422 18:16:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.422 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:31.422 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:31.422 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:31.422 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:31.422 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:31.681 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:31.681 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:31.681 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:31.681 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:31.681 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:31.939 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:31.939 "name": "BaseBdev2", 00:12:31.939 "aliases": [ 00:12:31.939 "110beeac-935b-46bc-86d5-3c8f7f4e4fb6" 00:12:31.939 ], 00:12:31.939 "product_name": "Malloc disk", 00:12:31.939 "block_size": 512, 00:12:31.940 "num_blocks": 65536, 00:12:31.940 "uuid": "110beeac-935b-46bc-86d5-3c8f7f4e4fb6", 00:12:31.940 
"assigned_rate_limits": { 00:12:31.940 "rw_ios_per_sec": 0, 00:12:31.940 "rw_mbytes_per_sec": 0, 00:12:31.940 "r_mbytes_per_sec": 0, 00:12:31.940 "w_mbytes_per_sec": 0 00:12:31.940 }, 00:12:31.940 "claimed": true, 00:12:31.940 "claim_type": "exclusive_write", 00:12:31.940 "zoned": false, 00:12:31.940 "supported_io_types": { 00:12:31.940 "read": true, 00:12:31.940 "write": true, 00:12:31.940 "unmap": true, 00:12:31.940 "flush": true, 00:12:31.940 "reset": true, 00:12:31.940 "nvme_admin": false, 00:12:31.940 "nvme_io": false, 00:12:31.940 "nvme_io_md": false, 00:12:31.940 "write_zeroes": true, 00:12:31.940 "zcopy": true, 00:12:31.940 "get_zone_info": false, 00:12:31.940 "zone_management": false, 00:12:31.940 "zone_append": false, 00:12:31.940 "compare": false, 00:12:31.940 "compare_and_write": false, 00:12:31.940 "abort": true, 00:12:31.940 "seek_hole": false, 00:12:31.940 "seek_data": false, 00:12:31.940 "copy": true, 00:12:31.940 "nvme_iov_md": false 00:12:31.940 }, 00:12:31.940 "memory_domains": [ 00:12:31.940 { 00:12:31.940 "dma_device_id": "system", 00:12:31.940 "dma_device_type": 1 00:12:31.940 }, 00:12:31.940 { 00:12:31.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.940 "dma_device_type": 2 00:12:31.940 } 00:12:31.940 ], 00:12:31.940 "driver_specific": {} 00:12:31.940 }' 00:12:31.940 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.940 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.940 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:31.940 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.940 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.940 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:31.940 18:16:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:31.940 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.199 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.199 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.199 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.199 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.199 18:16:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:32.458 [2024-07-12 18:16:15.997002] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.458 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.717 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.717 "name": "Existed_Raid", 00:12:32.717 "uuid": "c9a0cfcc-dde8-4ab0-b4ff-31fb7ad5e244", 00:12:32.717 "strip_size_kb": 0, 00:12:32.717 "state": "online", 00:12:32.717 "raid_level": "raid1", 00:12:32.717 "superblock": true, 00:12:32.717 "num_base_bdevs": 2, 00:12:32.717 "num_base_bdevs_discovered": 1, 00:12:32.717 "num_base_bdevs_operational": 1, 00:12:32.717 "base_bdevs_list": [ 00:12:32.717 { 00:12:32.717 "name": null, 00:12:32.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.717 "is_configured": false, 00:12:32.717 "data_offset": 2048, 00:12:32.717 "data_size": 63488 00:12:32.717 }, 00:12:32.717 { 00:12:32.717 "name": "BaseBdev2", 00:12:32.717 "uuid": "110beeac-935b-46bc-86d5-3c8f7f4e4fb6", 00:12:32.717 "is_configured": true, 00:12:32.717 "data_offset": 2048, 00:12:32.717 "data_size": 63488 00:12:32.717 } 00:12:32.717 ] 00:12:32.717 }' 00:12:32.717 18:16:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.717 18:16:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:33.284 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:33.284 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:33.284 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.284 18:16:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:33.543 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:33.543 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:33.543 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:33.802 [2024-07-12 18:16:17.334450] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:33.802 [2024-07-12 18:16:17.334531] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:33.802 [2024-07-12 18:16:17.347277] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:33.802 [2024-07-12 18:16:17.347313] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:33.802 [2024-07-12 18:16:17.347325] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xda7000 name Existed_Raid, state offline 00:12:33.802 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:33.802 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:12:33.802 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.802 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2471613 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2471613 ']' 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2471613 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2471613 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2471613' 00:12:34.061 killing process with pid 2471613 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2471613 00:12:34.061 [2024-07-12 18:16:17.663290] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:12:34.061 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2471613 00:12:34.061 [2024-07-12 18:16:17.664271] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:34.336 18:16:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:34.336 00:12:34.336 real 0m10.122s 00:12:34.336 user 0m17.968s 00:12:34.336 sys 0m1.897s 00:12:34.336 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:34.336 18:16:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:34.336 ************************************ 00:12:34.336 END TEST raid_state_function_test_sb 00:12:34.336 ************************************ 00:12:34.336 18:16:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:34.336 18:16:17 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:12:34.336 18:16:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:34.336 18:16:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:34.336 18:16:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:34.336 ************************************ 00:12:34.336 START TEST raid_superblock_test 00:12:34.336 ************************************ 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:34.336 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2473157 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2473157 /var/tmp/spdk-raid.sock 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2473157 ']' 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start 
up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:34.337 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:34.337 18:16:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.337 [2024-07-12 18:16:18.014587] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:12:34.337 [2024-07-12 18:16:18.014657] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2473157 ] 00:12:34.595 [2024-07-12 18:16:18.142178] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:34.595 [2024-07-12 18:16:18.238871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:34.595 [2024-07-12 18:16:18.296967] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:34.595 [2024-07-12 18:16:18.297003] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:35.162 
18:16:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:35.162 18:16:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:35.421 malloc1 00:12:35.421 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:35.679 [2024-07-12 18:16:19.349527] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:35.679 [2024-07-12 18:16:19.349573] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:35.679 [2024-07-12 18:16:19.349593] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24e9570 00:12:35.679 [2024-07-12 18:16:19.349606] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:35.679 [2024-07-12 18:16:19.351335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:35.679 [2024-07-12 18:16:19.351365] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:35.679 pt1 00:12:35.679 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:35.679 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:35.679 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:35.679 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:35.679 18:16:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:35.679 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:35.679 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:35.679 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:35.679 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:35.939 malloc2 00:12:35.939 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:36.199 [2024-07-12 18:16:19.843575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:36.199 [2024-07-12 18:16:19.843622] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:36.199 [2024-07-12 18:16:19.843638] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24ea970 00:12:36.199 [2024-07-12 18:16:19.843650] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:36.199 [2024-07-12 18:16:19.845262] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:36.199 [2024-07-12 18:16:19.845291] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:36.199 pt2 00:12:36.199 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:36.199 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:36.199 18:16:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:36.458 [2024-07-12 18:16:20.080225] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:36.458 [2024-07-12 18:16:20.081621] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:36.458 [2024-07-12 18:16:20.081769] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x268d270 00:12:36.458 [2024-07-12 18:16:20.081782] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:36.458 [2024-07-12 18:16:20.081998] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e10e0 00:12:36.458 [2024-07-12 18:16:20.082146] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x268d270 00:12:36.458 [2024-07-12 18:16:20.082156] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x268d270 00:12:36.458 [2024-07-12 18:16:20.082261] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:36.458 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.718 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.718 "name": "raid_bdev1", 00:12:36.718 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:36.718 "strip_size_kb": 0, 00:12:36.718 "state": "online", 00:12:36.718 "raid_level": "raid1", 00:12:36.718 "superblock": true, 00:12:36.718 "num_base_bdevs": 2, 00:12:36.718 "num_base_bdevs_discovered": 2, 00:12:36.718 "num_base_bdevs_operational": 2, 00:12:36.718 "base_bdevs_list": [ 00:12:36.718 { 00:12:36.718 "name": "pt1", 00:12:36.718 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:36.718 "is_configured": true, 00:12:36.718 "data_offset": 2048, 00:12:36.718 "data_size": 63488 00:12:36.718 }, 00:12:36.718 { 00:12:36.718 "name": "pt2", 00:12:36.718 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:36.718 "is_configured": true, 00:12:36.718 "data_offset": 2048, 00:12:36.718 "data_size": 63488 00:12:36.718 } 00:12:36.718 ] 00:12:36.718 }' 00:12:36.718 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.718 18:16:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:37.286 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:37.286 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:37.286 18:16:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:37.286 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:37.286 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:37.286 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:37.286 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:37.286 18:16:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:37.546 [2024-07-12 18:16:21.163294] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:37.546 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:37.546 "name": "raid_bdev1", 00:12:37.546 "aliases": [ 00:12:37.546 "cf4fd141-379f-4176-8005-dc3b01d408ec" 00:12:37.546 ], 00:12:37.546 "product_name": "Raid Volume", 00:12:37.546 "block_size": 512, 00:12:37.546 "num_blocks": 63488, 00:12:37.546 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:37.546 "assigned_rate_limits": { 00:12:37.546 "rw_ios_per_sec": 0, 00:12:37.546 "rw_mbytes_per_sec": 0, 00:12:37.546 "r_mbytes_per_sec": 0, 00:12:37.546 "w_mbytes_per_sec": 0 00:12:37.546 }, 00:12:37.546 "claimed": false, 00:12:37.546 "zoned": false, 00:12:37.546 "supported_io_types": { 00:12:37.546 "read": true, 00:12:37.546 "write": true, 00:12:37.546 "unmap": false, 00:12:37.546 "flush": false, 00:12:37.546 "reset": true, 00:12:37.546 "nvme_admin": false, 00:12:37.546 "nvme_io": false, 00:12:37.546 "nvme_io_md": false, 00:12:37.546 "write_zeroes": true, 00:12:37.546 "zcopy": false, 00:12:37.546 "get_zone_info": false, 00:12:37.546 "zone_management": false, 00:12:37.546 "zone_append": false, 00:12:37.546 "compare": false, 00:12:37.546 "compare_and_write": false, 00:12:37.546 
"abort": false, 00:12:37.546 "seek_hole": false, 00:12:37.546 "seek_data": false, 00:12:37.546 "copy": false, 00:12:37.546 "nvme_iov_md": false 00:12:37.546 }, 00:12:37.546 "memory_domains": [ 00:12:37.546 { 00:12:37.546 "dma_device_id": "system", 00:12:37.546 "dma_device_type": 1 00:12:37.546 }, 00:12:37.546 { 00:12:37.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.546 "dma_device_type": 2 00:12:37.546 }, 00:12:37.546 { 00:12:37.546 "dma_device_id": "system", 00:12:37.546 "dma_device_type": 1 00:12:37.546 }, 00:12:37.546 { 00:12:37.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.546 "dma_device_type": 2 00:12:37.546 } 00:12:37.546 ], 00:12:37.546 "driver_specific": { 00:12:37.546 "raid": { 00:12:37.546 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:37.546 "strip_size_kb": 0, 00:12:37.546 "state": "online", 00:12:37.546 "raid_level": "raid1", 00:12:37.546 "superblock": true, 00:12:37.546 "num_base_bdevs": 2, 00:12:37.546 "num_base_bdevs_discovered": 2, 00:12:37.546 "num_base_bdevs_operational": 2, 00:12:37.546 "base_bdevs_list": [ 00:12:37.546 { 00:12:37.546 "name": "pt1", 00:12:37.546 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:37.546 "is_configured": true, 00:12:37.546 "data_offset": 2048, 00:12:37.546 "data_size": 63488 00:12:37.546 }, 00:12:37.546 { 00:12:37.546 "name": "pt2", 00:12:37.546 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:37.546 "is_configured": true, 00:12:37.546 "data_offset": 2048, 00:12:37.546 "data_size": 63488 00:12:37.546 } 00:12:37.546 ] 00:12:37.546 } 00:12:37.546 } 00:12:37.546 }' 00:12:37.546 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:37.546 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:37.546 pt2' 00:12:37.546 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:37.546 18:16:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:37.546 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:37.806 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:37.806 "name": "pt1", 00:12:37.806 "aliases": [ 00:12:37.806 "00000000-0000-0000-0000-000000000001" 00:12:37.806 ], 00:12:37.806 "product_name": "passthru", 00:12:37.806 "block_size": 512, 00:12:37.806 "num_blocks": 65536, 00:12:37.806 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:37.806 "assigned_rate_limits": { 00:12:37.806 "rw_ios_per_sec": 0, 00:12:37.806 "rw_mbytes_per_sec": 0, 00:12:37.806 "r_mbytes_per_sec": 0, 00:12:37.806 "w_mbytes_per_sec": 0 00:12:37.806 }, 00:12:37.806 "claimed": true, 00:12:37.806 "claim_type": "exclusive_write", 00:12:37.806 "zoned": false, 00:12:37.806 "supported_io_types": { 00:12:37.806 "read": true, 00:12:37.806 "write": true, 00:12:37.806 "unmap": true, 00:12:37.806 "flush": true, 00:12:37.806 "reset": true, 00:12:37.806 "nvme_admin": false, 00:12:37.806 "nvme_io": false, 00:12:37.806 "nvme_io_md": false, 00:12:37.806 "write_zeroes": true, 00:12:37.806 "zcopy": true, 00:12:37.806 "get_zone_info": false, 00:12:37.806 "zone_management": false, 00:12:37.806 "zone_append": false, 00:12:37.806 "compare": false, 00:12:37.806 "compare_and_write": false, 00:12:37.806 "abort": true, 00:12:37.806 "seek_hole": false, 00:12:37.806 "seek_data": false, 00:12:37.806 "copy": true, 00:12:37.806 "nvme_iov_md": false 00:12:37.806 }, 00:12:37.806 "memory_domains": [ 00:12:37.806 { 00:12:37.806 "dma_device_id": "system", 00:12:37.806 "dma_device_type": 1 00:12:37.806 }, 00:12:37.806 { 00:12:37.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.806 "dma_device_type": 2 00:12:37.806 } 00:12:37.806 ], 00:12:37.806 "driver_specific": { 00:12:37.806 "passthru": { 00:12:37.806 
"name": "pt1", 00:12:37.806 "base_bdev_name": "malloc1" 00:12:37.806 } 00:12:37.806 } 00:12:37.806 }' 00:12:37.806 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.806 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.065 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:38.065 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.065 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.065 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:38.065 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.065 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.065 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:38.065 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.065 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.324 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.324 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:38.324 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:38.324 18:16:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:38.583 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:38.583 "name": "pt2", 00:12:38.583 "aliases": [ 00:12:38.583 "00000000-0000-0000-0000-000000000002" 00:12:38.583 ], 00:12:38.583 "product_name": "passthru", 00:12:38.583 "block_size": 512, 00:12:38.583 
"num_blocks": 65536, 00:12:38.583 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:38.583 "assigned_rate_limits": { 00:12:38.583 "rw_ios_per_sec": 0, 00:12:38.583 "rw_mbytes_per_sec": 0, 00:12:38.583 "r_mbytes_per_sec": 0, 00:12:38.583 "w_mbytes_per_sec": 0 00:12:38.583 }, 00:12:38.583 "claimed": true, 00:12:38.583 "claim_type": "exclusive_write", 00:12:38.583 "zoned": false, 00:12:38.583 "supported_io_types": { 00:12:38.583 "read": true, 00:12:38.583 "write": true, 00:12:38.583 "unmap": true, 00:12:38.583 "flush": true, 00:12:38.583 "reset": true, 00:12:38.583 "nvme_admin": false, 00:12:38.583 "nvme_io": false, 00:12:38.583 "nvme_io_md": false, 00:12:38.583 "write_zeroes": true, 00:12:38.583 "zcopy": true, 00:12:38.583 "get_zone_info": false, 00:12:38.583 "zone_management": false, 00:12:38.583 "zone_append": false, 00:12:38.583 "compare": false, 00:12:38.583 "compare_and_write": false, 00:12:38.583 "abort": true, 00:12:38.583 "seek_hole": false, 00:12:38.583 "seek_data": false, 00:12:38.583 "copy": true, 00:12:38.583 "nvme_iov_md": false 00:12:38.583 }, 00:12:38.583 "memory_domains": [ 00:12:38.583 { 00:12:38.583 "dma_device_id": "system", 00:12:38.583 "dma_device_type": 1 00:12:38.583 }, 00:12:38.583 { 00:12:38.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.583 "dma_device_type": 2 00:12:38.583 } 00:12:38.583 ], 00:12:38.583 "driver_specific": { 00:12:38.583 "passthru": { 00:12:38.583 "name": "pt2", 00:12:38.583 "base_bdev_name": "malloc2" 00:12:38.583 } 00:12:38.583 } 00:12:38.583 }' 00:12:38.583 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.583 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.583 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:38.583 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.583 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:12:38.583 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:38.583 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.583 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.842 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:38.842 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.842 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.842 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.842 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:38.842 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:39.101 [2024-07-12 18:16:22.615149] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:39.101 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=cf4fd141-379f-4176-8005-dc3b01d408ec 00:12:39.101 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z cf4fd141-379f-4176-8005-dc3b01d408ec ']' 00:12:39.101 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:39.359 [2024-07-12 18:16:22.851524] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:39.359 [2024-07-12 18:16:22.851543] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:39.359 [2024-07-12 18:16:22.851592] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:39.359 [2024-07-12 
18:16:22.851646] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:39.359 [2024-07-12 18:16:22.851658] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x268d270 name raid_bdev1, state offline 00:12:39.359 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.359 18:16:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:39.617 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:39.617 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:39.617 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:39.617 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:39.875 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:39.875 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:39.875 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:39.875 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:40.133 18:16:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:40.392 [2024-07-12 18:16:24.070707] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:40.392 [2024-07-12 18:16:24.072047] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:40.392 [2024-07-12 18:16:24.072101] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:40.392 [2024-07-12 18:16:24.072139] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:40.392 [2024-07-12 18:16:24.072157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:40.392 [2024-07-12 18:16:24.072167] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x268cff0 name raid_bdev1, state configuring 00:12:40.392 request: 00:12:40.392 { 00:12:40.392 "name": "raid_bdev1", 00:12:40.392 "raid_level": "raid1", 00:12:40.392 "base_bdevs": [ 00:12:40.392 "malloc1", 00:12:40.392 "malloc2" 00:12:40.392 ], 00:12:40.392 "superblock": false, 00:12:40.392 "method": "bdev_raid_create", 00:12:40.392 "req_id": 1 00:12:40.392 } 00:12:40.392 Got JSON-RPC error response 00:12:40.392 response: 00:12:40.392 { 00:12:40.392 "code": -17, 00:12:40.392 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:40.392 } 00:12:40.392 18:16:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:40.392 18:16:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:40.392 18:16:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:40.392 18:16:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:40.392 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.392 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:40.652 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 
00:12:40.652 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:40.652 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:40.910 [2024-07-12 18:16:24.475727] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:40.910 [2024-07-12 18:16:24.475765] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:40.910 [2024-07-12 18:16:24.475785] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24e97a0 00:12:40.910 [2024-07-12 18:16:24.475797] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:40.910 [2024-07-12 18:16:24.477415] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:40.910 [2024-07-12 18:16:24.477441] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:40.910 [2024-07-12 18:16:24.477514] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:40.910 [2024-07-12 18:16:24.477539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:40.910 pt1 00:12:40.910 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:12:40.910 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:40.910 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:40.910 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:40.910 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:40.910 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:12:40.910 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:40.910 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:40.910 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:40.911 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:40.911 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.911 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:41.170 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.170 "name": "raid_bdev1", 00:12:41.170 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:41.170 "strip_size_kb": 0, 00:12:41.170 "state": "configuring", 00:12:41.170 "raid_level": "raid1", 00:12:41.170 "superblock": true, 00:12:41.170 "num_base_bdevs": 2, 00:12:41.170 "num_base_bdevs_discovered": 1, 00:12:41.170 "num_base_bdevs_operational": 2, 00:12:41.170 "base_bdevs_list": [ 00:12:41.170 { 00:12:41.170 "name": "pt1", 00:12:41.170 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:41.170 "is_configured": true, 00:12:41.170 "data_offset": 2048, 00:12:41.170 "data_size": 63488 00:12:41.170 }, 00:12:41.170 { 00:12:41.170 "name": null, 00:12:41.170 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:41.170 "is_configured": false, 00:12:41.170 "data_offset": 2048, 00:12:41.170 "data_size": 63488 00:12:41.170 } 00:12:41.170 ] 00:12:41.170 }' 00:12:41.170 18:16:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.170 18:16:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.738 18:16:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:41.738 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:41.738 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:41.738 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:41.997 [2024-07-12 18:16:25.546569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:41.997 [2024-07-12 18:16:25.546614] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:41.997 [2024-07-12 18:16:25.546632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26816f0 00:12:41.997 [2024-07-12 18:16:25.546644] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:41.997 [2024-07-12 18:16:25.546989] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:41.997 [2024-07-12 18:16:25.547007] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:41.997 [2024-07-12 18:16:25.547068] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:41.997 [2024-07-12 18:16:25.547086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:41.997 [2024-07-12 18:16:25.547180] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2682590 00:12:41.997 [2024-07-12 18:16:25.547191] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:41.997 [2024-07-12 18:16:25.547356] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e3540 00:12:41.997 [2024-07-12 18:16:25.547477] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2682590 00:12:41.997 [2024-07-12 18:16:25.547486] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2682590 00:12:41.997 [2024-07-12 18:16:25.547580] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:41.997 pt2 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.997 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:42.257 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.257 "name": 
"raid_bdev1", 00:12:42.257 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:42.257 "strip_size_kb": 0, 00:12:42.257 "state": "online", 00:12:42.257 "raid_level": "raid1", 00:12:42.257 "superblock": true, 00:12:42.257 "num_base_bdevs": 2, 00:12:42.257 "num_base_bdevs_discovered": 2, 00:12:42.257 "num_base_bdevs_operational": 2, 00:12:42.257 "base_bdevs_list": [ 00:12:42.257 { 00:12:42.257 "name": "pt1", 00:12:42.257 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:42.257 "is_configured": true, 00:12:42.257 "data_offset": 2048, 00:12:42.257 "data_size": 63488 00:12:42.257 }, 00:12:42.257 { 00:12:42.257 "name": "pt2", 00:12:42.257 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:42.257 "is_configured": true, 00:12:42.257 "data_offset": 2048, 00:12:42.257 "data_size": 63488 00:12:42.257 } 00:12:42.257 ] 00:12:42.257 }' 00:12:42.257 18:16:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.257 18:16:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:42.824 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:42.824 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:42.824 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:42.824 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:42.824 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:42.824 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:42.824 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:42.824 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:43.083 [2024-07-12 
18:16:26.641720] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:43.083 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:43.083 "name": "raid_bdev1", 00:12:43.083 "aliases": [ 00:12:43.083 "cf4fd141-379f-4176-8005-dc3b01d408ec" 00:12:43.083 ], 00:12:43.083 "product_name": "Raid Volume", 00:12:43.083 "block_size": 512, 00:12:43.083 "num_blocks": 63488, 00:12:43.083 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:43.083 "assigned_rate_limits": { 00:12:43.083 "rw_ios_per_sec": 0, 00:12:43.083 "rw_mbytes_per_sec": 0, 00:12:43.083 "r_mbytes_per_sec": 0, 00:12:43.083 "w_mbytes_per_sec": 0 00:12:43.083 }, 00:12:43.083 "claimed": false, 00:12:43.083 "zoned": false, 00:12:43.083 "supported_io_types": { 00:12:43.083 "read": true, 00:12:43.083 "write": true, 00:12:43.083 "unmap": false, 00:12:43.083 "flush": false, 00:12:43.083 "reset": true, 00:12:43.083 "nvme_admin": false, 00:12:43.083 "nvme_io": false, 00:12:43.083 "nvme_io_md": false, 00:12:43.083 "write_zeroes": true, 00:12:43.083 "zcopy": false, 00:12:43.084 "get_zone_info": false, 00:12:43.084 "zone_management": false, 00:12:43.084 "zone_append": false, 00:12:43.084 "compare": false, 00:12:43.084 "compare_and_write": false, 00:12:43.084 "abort": false, 00:12:43.084 "seek_hole": false, 00:12:43.084 "seek_data": false, 00:12:43.084 "copy": false, 00:12:43.084 "nvme_iov_md": false 00:12:43.084 }, 00:12:43.084 "memory_domains": [ 00:12:43.084 { 00:12:43.084 "dma_device_id": "system", 00:12:43.084 "dma_device_type": 1 00:12:43.084 }, 00:12:43.084 { 00:12:43.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.084 "dma_device_type": 2 00:12:43.084 }, 00:12:43.084 { 00:12:43.084 "dma_device_id": "system", 00:12:43.084 "dma_device_type": 1 00:12:43.084 }, 00:12:43.084 { 00:12:43.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.084 "dma_device_type": 2 00:12:43.084 } 00:12:43.084 ], 00:12:43.084 "driver_specific": { 00:12:43.084 
"raid": { 00:12:43.084 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:43.084 "strip_size_kb": 0, 00:12:43.084 "state": "online", 00:12:43.084 "raid_level": "raid1", 00:12:43.084 "superblock": true, 00:12:43.084 "num_base_bdevs": 2, 00:12:43.084 "num_base_bdevs_discovered": 2, 00:12:43.084 "num_base_bdevs_operational": 2, 00:12:43.084 "base_bdevs_list": [ 00:12:43.084 { 00:12:43.084 "name": "pt1", 00:12:43.084 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:43.084 "is_configured": true, 00:12:43.084 "data_offset": 2048, 00:12:43.084 "data_size": 63488 00:12:43.084 }, 00:12:43.084 { 00:12:43.084 "name": "pt2", 00:12:43.084 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:43.084 "is_configured": true, 00:12:43.084 "data_offset": 2048, 00:12:43.084 "data_size": 63488 00:12:43.084 } 00:12:43.084 ] 00:12:43.084 } 00:12:43.084 } 00:12:43.084 }' 00:12:43.084 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:43.084 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:43.084 pt2' 00:12:43.084 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:43.084 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:43.084 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:43.343 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:43.343 "name": "pt1", 00:12:43.343 "aliases": [ 00:12:43.343 "00000000-0000-0000-0000-000000000001" 00:12:43.343 ], 00:12:43.343 "product_name": "passthru", 00:12:43.343 "block_size": 512, 00:12:43.343 "num_blocks": 65536, 00:12:43.343 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:43.343 "assigned_rate_limits": { 
00:12:43.343 "rw_ios_per_sec": 0, 00:12:43.343 "rw_mbytes_per_sec": 0, 00:12:43.343 "r_mbytes_per_sec": 0, 00:12:43.343 "w_mbytes_per_sec": 0 00:12:43.343 }, 00:12:43.343 "claimed": true, 00:12:43.343 "claim_type": "exclusive_write", 00:12:43.343 "zoned": false, 00:12:43.343 "supported_io_types": { 00:12:43.343 "read": true, 00:12:43.343 "write": true, 00:12:43.343 "unmap": true, 00:12:43.343 "flush": true, 00:12:43.343 "reset": true, 00:12:43.343 "nvme_admin": false, 00:12:43.343 "nvme_io": false, 00:12:43.343 "nvme_io_md": false, 00:12:43.343 "write_zeroes": true, 00:12:43.343 "zcopy": true, 00:12:43.343 "get_zone_info": false, 00:12:43.343 "zone_management": false, 00:12:43.343 "zone_append": false, 00:12:43.343 "compare": false, 00:12:43.343 "compare_and_write": false, 00:12:43.343 "abort": true, 00:12:43.343 "seek_hole": false, 00:12:43.343 "seek_data": false, 00:12:43.343 "copy": true, 00:12:43.343 "nvme_iov_md": false 00:12:43.343 }, 00:12:43.343 "memory_domains": [ 00:12:43.343 { 00:12:43.343 "dma_device_id": "system", 00:12:43.343 "dma_device_type": 1 00:12:43.343 }, 00:12:43.343 { 00:12:43.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.343 "dma_device_type": 2 00:12:43.343 } 00:12:43.343 ], 00:12:43.343 "driver_specific": { 00:12:43.343 "passthru": { 00:12:43.343 "name": "pt1", 00:12:43.343 "base_bdev_name": "malloc1" 00:12:43.343 } 00:12:43.343 } 00:12:43.343 }' 00:12:43.343 18:16:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.343 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:43.343 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:43.343 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.602 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:43.602 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:12:43.602 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.602 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:43.602 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:43.603 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.603 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:43.603 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:43.603 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:43.603 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:43.603 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:43.862 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:43.862 "name": "pt2", 00:12:43.862 "aliases": [ 00:12:43.862 "00000000-0000-0000-0000-000000000002" 00:12:43.862 ], 00:12:43.862 "product_name": "passthru", 00:12:43.862 "block_size": 512, 00:12:43.862 "num_blocks": 65536, 00:12:43.862 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:43.862 "assigned_rate_limits": { 00:12:43.862 "rw_ios_per_sec": 0, 00:12:43.862 "rw_mbytes_per_sec": 0, 00:12:43.862 "r_mbytes_per_sec": 0, 00:12:43.862 "w_mbytes_per_sec": 0 00:12:43.862 }, 00:12:43.862 "claimed": true, 00:12:43.862 "claim_type": "exclusive_write", 00:12:43.862 "zoned": false, 00:12:43.862 "supported_io_types": { 00:12:43.862 "read": true, 00:12:43.862 "write": true, 00:12:43.862 "unmap": true, 00:12:43.862 "flush": true, 00:12:43.862 "reset": true, 00:12:43.862 "nvme_admin": false, 00:12:43.862 "nvme_io": false, 00:12:43.862 "nvme_io_md": false, 00:12:43.862 "write_zeroes": true, 
00:12:43.862 "zcopy": true, 00:12:43.862 "get_zone_info": false, 00:12:43.862 "zone_management": false, 00:12:43.862 "zone_append": false, 00:12:43.862 "compare": false, 00:12:43.862 "compare_and_write": false, 00:12:43.862 "abort": true, 00:12:43.862 "seek_hole": false, 00:12:43.862 "seek_data": false, 00:12:43.862 "copy": true, 00:12:43.862 "nvme_iov_md": false 00:12:43.862 }, 00:12:43.862 "memory_domains": [ 00:12:43.862 { 00:12:43.862 "dma_device_id": "system", 00:12:43.862 "dma_device_type": 1 00:12:43.862 }, 00:12:43.862 { 00:12:43.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.862 "dma_device_type": 2 00:12:43.862 } 00:12:43.862 ], 00:12:43.862 "driver_specific": { 00:12:43.862 "passthru": { 00:12:43.862 "name": "pt2", 00:12:43.862 "base_bdev_name": "malloc2" 00:12:43.862 } 00:12:43.862 } 00:12:43.862 }' 00:12:43.862 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.121 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.121 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:44.121 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.121 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.121 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:44.121 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.121 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.121 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:44.121 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.379 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:44.379 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:12:44.379 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:44.379 18:16:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:44.638 [2024-07-12 18:16:28.133698] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:44.638 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' cf4fd141-379f-4176-8005-dc3b01d408ec '!=' cf4fd141-379f-4176-8005-dc3b01d408ec ']' 00:12:44.638 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:12:44.638 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:44.638 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:44.638 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:44.897 [2024-07-12 18:16:28.370110] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.897 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:45.156 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:45.156 "name": "raid_bdev1", 00:12:45.156 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:45.156 "strip_size_kb": 0, 00:12:45.156 "state": "online", 00:12:45.156 "raid_level": "raid1", 00:12:45.156 "superblock": true, 00:12:45.156 "num_base_bdevs": 2, 00:12:45.156 "num_base_bdevs_discovered": 1, 00:12:45.156 "num_base_bdevs_operational": 1, 00:12:45.156 "base_bdevs_list": [ 00:12:45.156 { 00:12:45.156 "name": null, 00:12:45.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:45.156 "is_configured": false, 00:12:45.156 "data_offset": 2048, 00:12:45.156 "data_size": 63488 00:12:45.156 }, 00:12:45.156 { 00:12:45.156 "name": "pt2", 00:12:45.156 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:45.156 "is_configured": true, 00:12:45.156 "data_offset": 2048, 00:12:45.156 "data_size": 63488 00:12:45.156 } 00:12:45.156 ] 00:12:45.156 }' 00:12:45.156 18:16:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:45.156 18:16:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:45.724 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:12:45.724 [2024-07-12 18:16:29.384767] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:45.724 [2024-07-12 18:16:29.384791] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:45.724 [2024-07-12 18:16:29.384836] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:45.724 [2024-07-12 18:16:29.384873] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:45.724 [2024-07-12 18:16:29.384885] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2682590 name raid_bdev1, state offline 00:12:45.724 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:12:45.724 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.983 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:12:45.983 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:12:45.983 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:12:45.983 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:45.983 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:46.242 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:12:46.242 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:12:46.242 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:12:46.242 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:12:46.242 18:16:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:12:46.242 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:46.501 [2024-07-12 18:16:29.978308] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:46.501 [2024-07-12 18:16:29.978358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:46.501 [2024-07-12 18:16:29.978375] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24ea160 00:12:46.501 [2024-07-12 18:16:29.978387] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:46.501 [2024-07-12 18:16:29.979980] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:46.501 [2024-07-12 18:16:29.980008] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:46.501 [2024-07-12 18:16:29.980077] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:46.501 [2024-07-12 18:16:29.980102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:46.501 [2024-07-12 18:16:29.980184] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24e0380 00:12:46.501 [2024-07-12 18:16:29.980194] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:46.501 [2024-07-12 18:16:29.980361] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e1a80 00:12:46.501 [2024-07-12 18:16:29.980479] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24e0380 00:12:46.501 [2024-07-12 18:16:29.980489] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24e0380 00:12:46.501 [2024-07-12 18:16:29.980582] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:46.501 pt2 00:12:46.501 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:46.501 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:46.501 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:46.501 18:16:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.501 "name": "raid_bdev1", 00:12:46.501 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:46.501 "strip_size_kb": 0, 00:12:46.501 "state": "online", 00:12:46.501 "raid_level": "raid1", 00:12:46.501 "superblock": true, 00:12:46.501 "num_base_bdevs": 2, 00:12:46.501 "num_base_bdevs_discovered": 1, 00:12:46.501 "num_base_bdevs_operational": 1, 00:12:46.501 "base_bdevs_list": [ 
00:12:46.501 { 00:12:46.501 "name": null, 00:12:46.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:46.501 "is_configured": false, 00:12:46.501 "data_offset": 2048, 00:12:46.501 "data_size": 63488 00:12:46.501 }, 00:12:46.501 { 00:12:46.501 "name": "pt2", 00:12:46.501 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:46.501 "is_configured": true, 00:12:46.501 "data_offset": 2048, 00:12:46.501 "data_size": 63488 00:12:46.501 } 00:12:46.501 ] 00:12:46.501 }' 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.501 18:16:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.082 18:16:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:47.663 [2024-07-12 18:16:31.241653] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:47.663 [2024-07-12 18:16:31.241680] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:47.663 [2024-07-12 18:16:31.241732] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:47.663 [2024-07-12 18:16:31.241775] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:47.663 [2024-07-12 18:16:31.241787] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e0380 name raid_bdev1, state offline 00:12:47.663 18:16:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.663 18:16:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:12:47.923 18:16:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:12:47.923 18:16:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:12:47.923 18:16:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:12:47.923 18:16:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:48.492 [2024-07-12 18:16:32.003639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:48.492 [2024-07-12 18:16:32.003681] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:48.492 [2024-07-12 18:16:32.003698] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x268c520 00:12:48.492 [2024-07-12 18:16:32.003711] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:48.492 [2024-07-12 18:16:32.005364] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:48.492 [2024-07-12 18:16:32.005392] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:48.492 [2024-07-12 18:16:32.005459] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:48.492 [2024-07-12 18:16:32.005482] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:48.492 [2024-07-12 18:16:32.005577] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:12:48.492 [2024-07-12 18:16:32.005590] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:48.492 [2024-07-12 18:16:32.005603] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e13f0 name raid_bdev1, state configuring 00:12:48.492 [2024-07-12 18:16:32.005625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:48.492 [2024-07-12 18:16:32.005684] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24e32b0 00:12:48.492 [2024-07-12 18:16:32.005695] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:48.492 [2024-07-12 18:16:32.005858] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e0350 00:12:48.492 [2024-07-12 18:16:32.005985] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24e32b0 00:12:48.492 [2024-07-12 18:16:32.005997] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24e32b0 00:12:48.492 [2024-07-12 18:16:32.006094] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:48.492 pt1 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.492 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:48.751 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.751 "name": "raid_bdev1", 00:12:48.751 "uuid": "cf4fd141-379f-4176-8005-dc3b01d408ec", 00:12:48.751 "strip_size_kb": 0, 00:12:48.751 "state": "online", 00:12:48.751 "raid_level": "raid1", 00:12:48.751 "superblock": true, 00:12:48.751 "num_base_bdevs": 2, 00:12:48.751 "num_base_bdevs_discovered": 1, 00:12:48.751 "num_base_bdevs_operational": 1, 00:12:48.751 "base_bdevs_list": [ 00:12:48.751 { 00:12:48.751 "name": null, 00:12:48.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.751 "is_configured": false, 00:12:48.752 "data_offset": 2048, 00:12:48.752 "data_size": 63488 00:12:48.752 }, 00:12:48.752 { 00:12:48.752 "name": "pt2", 00:12:48.752 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:48.752 "is_configured": true, 00:12:48.752 "data_offset": 2048, 00:12:48.752 "data_size": 63488 00:12:48.752 } 00:12:48.752 ] 00:12:48.752 }' 00:12:48.752 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.752 18:16:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.319 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:49.319 18:16:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:12:49.319 18:16:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:12:49.319 18:16:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:49.319 18:16:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:12:49.577 [2024-07-12 18:16:33.231127] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:49.577 18:16:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' cf4fd141-379f-4176-8005-dc3b01d408ec '!=' cf4fd141-379f-4176-8005-dc3b01d408ec ']' 00:12:49.577 18:16:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2473157 00:12:49.577 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2473157 ']' 00:12:49.577 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2473157 00:12:49.577 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:49.577 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:49.577 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2473157 00:12:49.577 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:49.836 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:49.837 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2473157' 00:12:49.837 killing process with pid 2473157 00:12:49.837 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2473157 00:12:49.837 [2024-07-12 18:16:33.305568] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:49.837 [2024-07-12 18:16:33.305616] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:49.837 [2024-07-12 18:16:33.305656] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:12:49.837 [2024-07-12 18:16:33.305667] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e32b0 name raid_bdev1, state offline 00:12:49.837 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2473157 00:12:49.837 [2024-07-12 18:16:33.323572] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:49.837 18:16:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:49.837 00:12:49.837 real 0m15.582s 00:12:49.837 user 0m28.302s 00:12:49.837 sys 0m2.779s 00:12:49.837 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:49.837 18:16:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.837 ************************************ 00:12:49.837 END TEST raid_superblock_test 00:12:49.837 ************************************ 00:12:50.096 18:16:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:50.096 18:16:33 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:12:50.096 18:16:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:50.096 18:16:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:50.096 18:16:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:50.096 ************************************ 00:12:50.096 START TEST raid_read_error_test 00:12:50.096 ************************************ 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:50.096 
18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ShNhtwxMb3 00:12:50.096 18:16:33 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2475619 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2475619 /var/tmp/spdk-raid.sock 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:50.096 18:16:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2475619 ']' 00:12:50.097 18:16:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:50.097 18:16:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:50.097 18:16:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:50.097 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:50.097 18:16:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:50.097 18:16:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:50.097 [2024-07-12 18:16:33.683247] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:12:50.097 [2024-07-12 18:16:33.683311] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2475619 ] 00:12:50.097 [2024-07-12 18:16:33.801864] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:50.356 [2024-07-12 18:16:33.910035] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:50.356 [2024-07-12 18:16:33.974172] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:50.356 [2024-07-12 18:16:33.974206] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:50.924 18:16:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:50.924 18:16:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:50.924 18:16:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:50.924 18:16:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:51.182 BaseBdev1_malloc 00:12:51.183 18:16:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:51.442 true 00:12:51.442 18:16:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:51.701 [2024-07-12 18:16:35.325518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:51.701 [2024-07-12 18:16:35.325563] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:51.701 [2024-07-12 18:16:35.325583] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127b0d0 00:12:51.701 [2024-07-12 18:16:35.325595] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.701 [2024-07-12 18:16:35.327460] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.701 [2024-07-12 18:16:35.327490] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:51.701 BaseBdev1 00:12:51.701 18:16:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:51.701 18:16:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:51.960 BaseBdev2_malloc 00:12:51.960 18:16:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:52.219 true 00:12:52.219 18:16:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:52.478 [2024-07-12 18:16:36.053247] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:52.478 [2024-07-12 18:16:36.053291] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:52.478 [2024-07-12 18:16:36.053311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127f910 00:12:52.478 [2024-07-12 18:16:36.053324] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:52.478 [2024-07-12 18:16:36.054875] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:52.478 [2024-07-12 18:16:36.054902] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:52.478 BaseBdev2 00:12:52.478 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:52.738 [2024-07-12 18:16:36.297915] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:52.738 [2024-07-12 18:16:36.299286] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:52.738 [2024-07-12 18:16:36.299468] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1281320 00:12:52.738 [2024-07-12 18:16:36.299482] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:52.738 [2024-07-12 18:16:36.299673] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10e8d00 00:12:52.738 [2024-07-12 18:16:36.299828] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1281320 00:12:52.738 [2024-07-12 18:16:36.299839] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1281320 00:12:52.738 [2024-07-12 18:16:36.299954] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.738 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:52.997 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.997 "name": "raid_bdev1", 00:12:52.997 "uuid": "7e347496-2e2b-4859-8e4d-290eb0427c08", 00:12:52.997 "strip_size_kb": 0, 00:12:52.997 "state": "online", 00:12:52.997 "raid_level": "raid1", 00:12:52.997 "superblock": true, 00:12:52.997 "num_base_bdevs": 2, 00:12:52.997 "num_base_bdevs_discovered": 2, 00:12:52.997 "num_base_bdevs_operational": 2, 00:12:52.997 "base_bdevs_list": [ 00:12:52.997 { 00:12:52.997 "name": "BaseBdev1", 00:12:52.997 "uuid": "54cacdce-5d9a-5a37-928c-47791ed3b579", 00:12:52.997 "is_configured": true, 00:12:52.997 "data_offset": 2048, 00:12:52.997 "data_size": 63488 00:12:52.997 }, 00:12:52.997 { 00:12:52.997 "name": "BaseBdev2", 00:12:52.997 "uuid": "c67fc978-2760-5022-9e8d-20a8790dc95d", 00:12:52.997 "is_configured": true, 00:12:52.997 "data_offset": 2048, 00:12:52.997 "data_size": 63488 00:12:52.997 } 00:12:52.997 ] 00:12:52.997 }' 00:12:52.997 18:16:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.997 18:16:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.564 18:16:37 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:53.564 18:16:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:53.564 [2024-07-12 18:16:37.264755] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x127cc70 00:12:54.500 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.759 18:16:38 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.759 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:55.018 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:55.018 "name": "raid_bdev1", 00:12:55.018 "uuid": "7e347496-2e2b-4859-8e4d-290eb0427c08", 00:12:55.018 "strip_size_kb": 0, 00:12:55.018 "state": "online", 00:12:55.018 "raid_level": "raid1", 00:12:55.018 "superblock": true, 00:12:55.018 "num_base_bdevs": 2, 00:12:55.018 "num_base_bdevs_discovered": 2, 00:12:55.018 "num_base_bdevs_operational": 2, 00:12:55.018 "base_bdevs_list": [ 00:12:55.018 { 00:12:55.018 "name": "BaseBdev1", 00:12:55.018 "uuid": "54cacdce-5d9a-5a37-928c-47791ed3b579", 00:12:55.018 "is_configured": true, 00:12:55.018 "data_offset": 2048, 00:12:55.018 "data_size": 63488 00:12:55.018 }, 00:12:55.018 { 00:12:55.018 "name": "BaseBdev2", 00:12:55.018 "uuid": "c67fc978-2760-5022-9e8d-20a8790dc95d", 00:12:55.018 "is_configured": true, 00:12:55.018 "data_offset": 2048, 00:12:55.018 "data_size": 63488 00:12:55.018 } 00:12:55.018 ] 00:12:55.018 }' 00:12:55.018 18:16:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:55.018 18:16:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.585 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:55.844 [2024-07-12 18:16:39.483788] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:12:55.844 [2024-07-12 18:16:39.483830] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:55.844 [2024-07-12 18:16:39.486967] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:55.844 [2024-07-12 18:16:39.486996] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:55.844 [2024-07-12 18:16:39.487074] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:55.844 [2024-07-12 18:16:39.487086] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1281320 name raid_bdev1, state offline 00:12:55.844 0 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2475619 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2475619 ']' 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2475619 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2475619 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2475619' 00:12:55.844 killing process with pid 2475619 00:12:55.844 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2475619 00:12:55.844 [2024-07-12 18:16:39.553194] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:55.844 18:16:39 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2475619 00:12:55.844 [2024-07-12 18:16:39.563949] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ShNhtwxMb3 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:12:56.103 00:12:56.103 real 0m6.183s 00:12:56.103 user 0m9.651s 00:12:56.103 sys 0m1.107s 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:56.103 18:16:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.103 ************************************ 00:12:56.103 END TEST raid_read_error_test 00:12:56.103 ************************************ 00:12:56.361 18:16:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:56.361 18:16:39 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:12:56.361 18:16:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:56.361 18:16:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:56.361 18:16:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:56.361 ************************************ 00:12:56.361 START TEST raid_write_error_test 00:12:56.361 
************************************ 00:12:56.361 18:16:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.OjMNq4Hu9K 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2476430 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2476430 /var/tmp/spdk-raid.sock 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2476430 ']' 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:56.362 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:56.362 18:16:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.362 [2024-07-12 18:16:39.961338] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:12:56.362 [2024-07-12 18:16:39.961405] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2476430 ] 00:12:56.362 [2024-07-12 18:16:40.087306] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:56.621 [2024-07-12 18:16:40.193770] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:56.621 [2024-07-12 18:16:40.246604] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:56.621 [2024-07-12 18:16:40.246635] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:57.187 18:16:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:57.187 18:16:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:57.187 18:16:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:57.187 18:16:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:57.445 BaseBdev1_malloc 00:12:57.445 18:16:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:57.703 true 00:12:57.703 18:16:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:57.961 [2024-07-12 18:16:41.641083] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:57.961 [2024-07-12 18:16:41.641129] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:12:57.961 [2024-07-12 18:16:41.641150] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x117f0d0 00:12:57.961 [2024-07-12 18:16:41.641163] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:57.961 [2024-07-12 18:16:41.643037] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:57.961 [2024-07-12 18:16:41.643068] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:57.961 BaseBdev1 00:12:57.961 18:16:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:57.961 18:16:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:58.220 BaseBdev2_malloc 00:12:58.220 18:16:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:58.479 true 00:12:58.479 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:59.047 [2024-07-12 18:16:42.628321] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:59.047 [2024-07-12 18:16:42.628371] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:59.047 [2024-07-12 18:16:42.628392] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1183910 00:12:59.047 [2024-07-12 18:16:42.628404] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:59.047 [2024-07-12 18:16:42.630045] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:59.047 [2024-07-12 18:16:42.630076] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:59.047 BaseBdev2 00:12:59.047 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:59.306 [2024-07-12 18:16:42.877011] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:59.306 [2024-07-12 18:16:42.878412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:59.306 [2024-07-12 18:16:42.878606] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1185320 00:12:59.306 [2024-07-12 18:16:42.878619] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:59.306 [2024-07-12 18:16:42.878822] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfecd00 00:12:59.306 [2024-07-12 18:16:42.878993] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1185320 00:12:59.306 [2024-07-12 18:16:42.879004] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1185320 00:12:59.306 [2024-07-12 18:16:42.879117] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.306 18:16:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:59.564 18:16:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:59.564 "name": "raid_bdev1", 00:12:59.564 "uuid": "3ca03c06-809e-471d-ae47-20a6c2aa7019", 00:12:59.564 "strip_size_kb": 0, 00:12:59.564 "state": "online", 00:12:59.564 "raid_level": "raid1", 00:12:59.564 "superblock": true, 00:12:59.564 "num_base_bdevs": 2, 00:12:59.564 "num_base_bdevs_discovered": 2, 00:12:59.564 "num_base_bdevs_operational": 2, 00:12:59.564 "base_bdevs_list": [ 00:12:59.564 { 00:12:59.564 "name": "BaseBdev1", 00:12:59.564 "uuid": "d2129493-85dc-5668-82c0-f5cb15f43294", 00:12:59.564 "is_configured": true, 00:12:59.564 "data_offset": 2048, 00:12:59.564 "data_size": 63488 00:12:59.564 }, 00:12:59.564 { 00:12:59.564 "name": "BaseBdev2", 00:12:59.564 "uuid": "38d8da87-d202-59ba-9d34-b8c7140f2deb", 00:12:59.564 "is_configured": true, 00:12:59.564 "data_offset": 2048, 00:12:59.564 "data_size": 63488 00:12:59.564 } 00:12:59.564 ] 00:12:59.564 }' 00:12:59.564 18:16:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:59.564 18:16:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.133 
18:16:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:00.133 18:16:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:00.133 [2024-07-12 18:16:43.835803] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1180c70 00:13:01.073 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:01.331 [2024-07-12 18:16:44.959550] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:01.331 [2024-07-12 18:16:44.959609] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:01.331 [2024-07-12 18:16:44.959787] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1180c70 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:01.331 18:16:44 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.331 18:16:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:01.589 18:16:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.589 "name": "raid_bdev1", 00:13:01.589 "uuid": "3ca03c06-809e-471d-ae47-20a6c2aa7019", 00:13:01.589 "strip_size_kb": 0, 00:13:01.589 "state": "online", 00:13:01.589 "raid_level": "raid1", 00:13:01.589 "superblock": true, 00:13:01.589 "num_base_bdevs": 2, 00:13:01.589 "num_base_bdevs_discovered": 1, 00:13:01.589 "num_base_bdevs_operational": 1, 00:13:01.589 "base_bdevs_list": [ 00:13:01.589 { 00:13:01.589 "name": null, 00:13:01.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:01.589 "is_configured": false, 00:13:01.589 "data_offset": 2048, 00:13:01.589 "data_size": 63488 00:13:01.589 }, 00:13:01.589 { 00:13:01.589 "name": "BaseBdev2", 00:13:01.589 "uuid": "38d8da87-d202-59ba-9d34-b8c7140f2deb", 00:13:01.589 "is_configured": true, 00:13:01.589 "data_offset": 2048, 00:13:01.589 "data_size": 63488 00:13:01.589 } 00:13:01.589 ] 00:13:01.589 }' 00:13:01.589 18:16:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:13:01.589 18:16:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.156 18:16:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:02.415 [2024-07-12 18:16:46.047650] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:02.415 [2024-07-12 18:16:46.047684] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:02.415 [2024-07-12 18:16:46.050802] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:02.415 [2024-07-12 18:16:46.050828] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:02.415 [2024-07-12 18:16:46.050879] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:02.415 [2024-07-12 18:16:46.050890] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1185320 name raid_bdev1, state offline 00:13:02.415 0 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2476430 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2476430 ']' 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2476430 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2476430 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2476430' 00:13:02.415 killing process with pid 2476430 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2476430 00:13:02.415 [2024-07-12 18:16:46.114669] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:02.415 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2476430 00:13:02.415 [2024-07-12 18:16:46.124876] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.OjMNq4Hu9K 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:02.675 00:13:02.675 real 0m6.463s 00:13:02.675 user 0m10.234s 00:13:02.675 sys 0m1.074s 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:02.675 18:16:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.675 ************************************ 00:13:02.675 END TEST raid_write_error_test 00:13:02.675 ************************************ 00:13:02.675 18:16:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:02.675 18:16:46 bdev_raid -- 
bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:13:02.675 18:16:46 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:02.675 18:16:46 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:02.675 18:16:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:02.675 18:16:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:02.675 18:16:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:02.935 ************************************ 00:13:02.935 START TEST raid_state_function_test 00:13:02.935 ************************************ 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:02.935 18:16:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2477407 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2477407' 00:13:02.935 Process raid pid: 2477407 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # 
waitforlisten 2477407 /var/tmp/spdk-raid.sock 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2477407 ']' 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:02.935 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:02.935 18:16:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.935 [2024-07-12 18:16:46.489080] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:13:02.935 [2024-07-12 18:16:46.489145] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:02.935 [2024-07-12 18:16:46.615867] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:03.194 [2024-07-12 18:16:46.721578] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:03.195 [2024-07-12 18:16:46.781711] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:03.195 [2024-07-12 18:16:46.781740] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:03.760 18:16:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:03.760 18:16:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:03.760 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:04.018 [2024-07-12 18:16:47.543827] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:04.018 [2024-07-12 18:16:47.543867] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:04.018 [2024-07-12 18:16:47.543881] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:04.018 [2024-07-12 18:16:47.543893] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:04.018 [2024-07-12 18:16:47.543902] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:04.018 [2024-07-12 18:16:47.543913] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:04.018 18:16:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.018 18:16:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:04.583 18:16:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:04.583 "name": "Existed_Raid", 00:13:04.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:04.583 "strip_size_kb": 64, 00:13:04.583 "state": "configuring", 00:13:04.583 "raid_level": "raid0", 00:13:04.583 "superblock": false, 00:13:04.583 "num_base_bdevs": 3, 00:13:04.583 "num_base_bdevs_discovered": 0, 00:13:04.583 "num_base_bdevs_operational": 3, 00:13:04.583 "base_bdevs_list": [ 00:13:04.583 { 
00:13:04.583 "name": "BaseBdev1", 00:13:04.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:04.583 "is_configured": false, 00:13:04.583 "data_offset": 0, 00:13:04.583 "data_size": 0 00:13:04.583 }, 00:13:04.583 { 00:13:04.583 "name": "BaseBdev2", 00:13:04.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:04.583 "is_configured": false, 00:13:04.583 "data_offset": 0, 00:13:04.583 "data_size": 0 00:13:04.583 }, 00:13:04.583 { 00:13:04.583 "name": "BaseBdev3", 00:13:04.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:04.583 "is_configured": false, 00:13:04.583 "data_offset": 0, 00:13:04.583 "data_size": 0 00:13:04.583 } 00:13:04.583 ] 00:13:04.583 }' 00:13:04.583 18:16:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:04.583 18:16:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:05.149 18:16:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:05.408 [2024-07-12 18:16:48.887238] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:05.408 [2024-07-12 18:16:48.887268] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cb3a80 name Existed_Raid, state configuring 00:13:05.408 18:16:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:05.408 [2024-07-12 18:16:49.131893] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:05.408 [2024-07-12 18:16:49.131919] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:05.408 [2024-07-12 18:16:49.131934] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:13:05.408 [2024-07-12 18:16:49.131946] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:05.408 [2024-07-12 18:16:49.131962] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:05.408 [2024-07-12 18:16:49.131973] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:05.666 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:05.667 [2024-07-12 18:16:49.382467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:05.667 BaseBdev1 00:13:05.925 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:05.925 18:16:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:05.925 18:16:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:05.925 18:16:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:05.925 18:16:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:05.925 18:16:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:05.925 18:16:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:05.925 18:16:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:06.184 [ 00:13:06.184 { 00:13:06.184 "name": "BaseBdev1", 00:13:06.184 "aliases": [ 00:13:06.184 
"35e3a6f6-5079-4334-952e-b25285546db7" 00:13:06.184 ], 00:13:06.184 "product_name": "Malloc disk", 00:13:06.184 "block_size": 512, 00:13:06.184 "num_blocks": 65536, 00:13:06.184 "uuid": "35e3a6f6-5079-4334-952e-b25285546db7", 00:13:06.184 "assigned_rate_limits": { 00:13:06.185 "rw_ios_per_sec": 0, 00:13:06.185 "rw_mbytes_per_sec": 0, 00:13:06.185 "r_mbytes_per_sec": 0, 00:13:06.185 "w_mbytes_per_sec": 0 00:13:06.185 }, 00:13:06.185 "claimed": true, 00:13:06.185 "claim_type": "exclusive_write", 00:13:06.185 "zoned": false, 00:13:06.185 "supported_io_types": { 00:13:06.185 "read": true, 00:13:06.185 "write": true, 00:13:06.185 "unmap": true, 00:13:06.185 "flush": true, 00:13:06.185 "reset": true, 00:13:06.185 "nvme_admin": false, 00:13:06.185 "nvme_io": false, 00:13:06.185 "nvme_io_md": false, 00:13:06.185 "write_zeroes": true, 00:13:06.185 "zcopy": true, 00:13:06.185 "get_zone_info": false, 00:13:06.185 "zone_management": false, 00:13:06.185 "zone_append": false, 00:13:06.185 "compare": false, 00:13:06.185 "compare_and_write": false, 00:13:06.185 "abort": true, 00:13:06.185 "seek_hole": false, 00:13:06.185 "seek_data": false, 00:13:06.185 "copy": true, 00:13:06.185 "nvme_iov_md": false 00:13:06.185 }, 00:13:06.185 "memory_domains": [ 00:13:06.185 { 00:13:06.185 "dma_device_id": "system", 00:13:06.185 "dma_device_type": 1 00:13:06.185 }, 00:13:06.185 { 00:13:06.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.185 "dma_device_type": 2 00:13:06.185 } 00:13:06.185 ], 00:13:06.185 "driver_specific": {} 00:13:06.185 } 00:13:06.185 ] 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.185 18:16:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:06.499 18:16:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:06.499 "name": "Existed_Raid", 00:13:06.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.499 "strip_size_kb": 64, 00:13:06.499 "state": "configuring", 00:13:06.499 "raid_level": "raid0", 00:13:06.499 "superblock": false, 00:13:06.499 "num_base_bdevs": 3, 00:13:06.499 "num_base_bdevs_discovered": 1, 00:13:06.499 "num_base_bdevs_operational": 3, 00:13:06.499 "base_bdevs_list": [ 00:13:06.499 { 00:13:06.499 "name": "BaseBdev1", 00:13:06.499 "uuid": "35e3a6f6-5079-4334-952e-b25285546db7", 00:13:06.499 "is_configured": true, 00:13:06.499 "data_offset": 0, 00:13:06.499 "data_size": 65536 00:13:06.499 }, 00:13:06.499 { 00:13:06.499 "name": "BaseBdev2", 00:13:06.499 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:06.499 "is_configured": false, 00:13:06.499 "data_offset": 0, 00:13:06.499 "data_size": 0 00:13:06.499 }, 00:13:06.499 { 00:13:06.499 "name": "BaseBdev3", 00:13:06.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.499 "is_configured": false, 00:13:06.499 "data_offset": 0, 00:13:06.499 "data_size": 0 00:13:06.499 } 00:13:06.499 ] 00:13:06.499 }' 00:13:06.499 18:16:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:06.499 18:16:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.068 18:16:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:07.327 [2024-07-12 18:16:50.914527] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:07.327 [2024-07-12 18:16:50.914564] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cb3310 name Existed_Raid, state configuring 00:13:07.327 18:16:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:07.587 [2024-07-12 18:16:51.159199] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:07.587 [2024-07-12 18:16:51.160652] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:07.587 [2024-07-12 18:16:51.160684] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:07.587 [2024-07-12 18:16:51.160694] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:07.587 [2024-07-12 18:16:51.160705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.587 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:07.846 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:07.846 "name": "Existed_Raid", 00:13:07.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.846 "strip_size_kb": 64, 00:13:07.846 "state": "configuring", 00:13:07.846 
"raid_level": "raid0", 00:13:07.846 "superblock": false, 00:13:07.846 "num_base_bdevs": 3, 00:13:07.846 "num_base_bdevs_discovered": 1, 00:13:07.846 "num_base_bdevs_operational": 3, 00:13:07.846 "base_bdevs_list": [ 00:13:07.846 { 00:13:07.846 "name": "BaseBdev1", 00:13:07.846 "uuid": "35e3a6f6-5079-4334-952e-b25285546db7", 00:13:07.846 "is_configured": true, 00:13:07.846 "data_offset": 0, 00:13:07.846 "data_size": 65536 00:13:07.846 }, 00:13:07.846 { 00:13:07.846 "name": "BaseBdev2", 00:13:07.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.846 "is_configured": false, 00:13:07.847 "data_offset": 0, 00:13:07.847 "data_size": 0 00:13:07.847 }, 00:13:07.847 { 00:13:07.847 "name": "BaseBdev3", 00:13:07.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.847 "is_configured": false, 00:13:07.847 "data_offset": 0, 00:13:07.847 "data_size": 0 00:13:07.847 } 00:13:07.847 ] 00:13:07.847 }' 00:13:07.847 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:07.847 18:16:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:08.415 18:16:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:08.673 [2024-07-12 18:16:52.197392] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:08.673 BaseBdev2 00:13:08.673 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:08.673 18:16:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:08.673 18:16:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:08.673 18:16:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:08.673 18:16:52 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:08.673 18:16:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:08.674 18:16:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:08.932 18:16:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:09.215 [ 00:13:09.215 { 00:13:09.215 "name": "BaseBdev2", 00:13:09.215 "aliases": [ 00:13:09.215 "1690f9fe-9821-43d7-9dd4-c9afe8e9b899" 00:13:09.215 ], 00:13:09.215 "product_name": "Malloc disk", 00:13:09.215 "block_size": 512, 00:13:09.215 "num_blocks": 65536, 00:13:09.215 "uuid": "1690f9fe-9821-43d7-9dd4-c9afe8e9b899", 00:13:09.215 "assigned_rate_limits": { 00:13:09.215 "rw_ios_per_sec": 0, 00:13:09.215 "rw_mbytes_per_sec": 0, 00:13:09.215 "r_mbytes_per_sec": 0, 00:13:09.215 "w_mbytes_per_sec": 0 00:13:09.215 }, 00:13:09.215 "claimed": true, 00:13:09.215 "claim_type": "exclusive_write", 00:13:09.215 "zoned": false, 00:13:09.215 "supported_io_types": { 00:13:09.215 "read": true, 00:13:09.215 "write": true, 00:13:09.215 "unmap": true, 00:13:09.215 "flush": true, 00:13:09.215 "reset": true, 00:13:09.215 "nvme_admin": false, 00:13:09.215 "nvme_io": false, 00:13:09.215 "nvme_io_md": false, 00:13:09.215 "write_zeroes": true, 00:13:09.215 "zcopy": true, 00:13:09.215 "get_zone_info": false, 00:13:09.215 "zone_management": false, 00:13:09.215 "zone_append": false, 00:13:09.215 "compare": false, 00:13:09.215 "compare_and_write": false, 00:13:09.215 "abort": true, 00:13:09.215 "seek_hole": false, 00:13:09.215 "seek_data": false, 00:13:09.215 "copy": true, 00:13:09.215 "nvme_iov_md": false 00:13:09.215 }, 00:13:09.215 "memory_domains": [ 00:13:09.215 { 00:13:09.215 "dma_device_id": "system", 
00:13:09.215 "dma_device_type": 1 00:13:09.215 }, 00:13:09.215 { 00:13:09.215 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.215 "dma_device_type": 2 00:13:09.215 } 00:13:09.215 ], 00:13:09.215 "driver_specific": {} 00:13:09.215 } 00:13:09.215 ] 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.215 18:16:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:09.474 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:09.474 "name": "Existed_Raid", 00:13:09.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.474 "strip_size_kb": 64, 00:13:09.474 "state": "configuring", 00:13:09.474 "raid_level": "raid0", 00:13:09.474 "superblock": false, 00:13:09.474 "num_base_bdevs": 3, 00:13:09.474 "num_base_bdevs_discovered": 2, 00:13:09.474 "num_base_bdevs_operational": 3, 00:13:09.474 "base_bdevs_list": [ 00:13:09.474 { 00:13:09.474 "name": "BaseBdev1", 00:13:09.474 "uuid": "35e3a6f6-5079-4334-952e-b25285546db7", 00:13:09.474 "is_configured": true, 00:13:09.474 "data_offset": 0, 00:13:09.474 "data_size": 65536 00:13:09.474 }, 00:13:09.474 { 00:13:09.474 "name": "BaseBdev2", 00:13:09.474 "uuid": "1690f9fe-9821-43d7-9dd4-c9afe8e9b899", 00:13:09.474 "is_configured": true, 00:13:09.474 "data_offset": 0, 00:13:09.474 "data_size": 65536 00:13:09.474 }, 00:13:09.474 { 00:13:09.474 "name": "BaseBdev3", 00:13:09.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:09.474 "is_configured": false, 00:13:09.474 "data_offset": 0, 00:13:09.474 "data_size": 0 00:13:09.474 } 00:13:09.474 ] 00:13:09.474 }' 00:13:09.474 18:16:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:09.474 18:16:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.041 18:16:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:10.041 [2024-07-12 18:16:53.696701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:10.041 [2024-07-12 18:16:53.696736] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cb4400 00:13:10.041 [2024-07-12 18:16:53.696745] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:10.041 [2024-07-12 18:16:53.696998] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cb3ef0 00:13:10.041 [2024-07-12 18:16:53.697115] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cb4400 00:13:10.041 [2024-07-12 18:16:53.697125] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cb4400 00:13:10.041 [2024-07-12 18:16:53.697277] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:10.041 BaseBdev3 00:13:10.041 18:16:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:10.041 18:16:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:10.041 18:16:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:10.041 18:16:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:10.041 18:16:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:10.041 18:16:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:10.041 18:16:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:10.299 18:16:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:10.558 [ 00:13:10.558 { 00:13:10.558 "name": "BaseBdev3", 00:13:10.558 "aliases": [ 00:13:10.558 "463bbe89-4786-49e6-a0f9-4e802c915719" 00:13:10.558 ], 00:13:10.558 "product_name": "Malloc disk", 00:13:10.558 "block_size": 512, 00:13:10.558 "num_blocks": 65536, 00:13:10.558 
"uuid": "463bbe89-4786-49e6-a0f9-4e802c915719", 00:13:10.558 "assigned_rate_limits": { 00:13:10.558 "rw_ios_per_sec": 0, 00:13:10.558 "rw_mbytes_per_sec": 0, 00:13:10.558 "r_mbytes_per_sec": 0, 00:13:10.558 "w_mbytes_per_sec": 0 00:13:10.558 }, 00:13:10.558 "claimed": true, 00:13:10.558 "claim_type": "exclusive_write", 00:13:10.558 "zoned": false, 00:13:10.558 "supported_io_types": { 00:13:10.558 "read": true, 00:13:10.558 "write": true, 00:13:10.558 "unmap": true, 00:13:10.558 "flush": true, 00:13:10.558 "reset": true, 00:13:10.558 "nvme_admin": false, 00:13:10.558 "nvme_io": false, 00:13:10.558 "nvme_io_md": false, 00:13:10.558 "write_zeroes": true, 00:13:10.558 "zcopy": true, 00:13:10.558 "get_zone_info": false, 00:13:10.558 "zone_management": false, 00:13:10.558 "zone_append": false, 00:13:10.558 "compare": false, 00:13:10.558 "compare_and_write": false, 00:13:10.558 "abort": true, 00:13:10.558 "seek_hole": false, 00:13:10.558 "seek_data": false, 00:13:10.558 "copy": true, 00:13:10.558 "nvme_iov_md": false 00:13:10.558 }, 00:13:10.558 "memory_domains": [ 00:13:10.558 { 00:13:10.558 "dma_device_id": "system", 00:13:10.558 "dma_device_type": 1 00:13:10.558 }, 00:13:10.558 { 00:13:10.558 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.558 "dma_device_type": 2 00:13:10.558 } 00:13:10.558 ], 00:13:10.558 "driver_specific": {} 00:13:10.558 } 00:13:10.558 ] 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:10.558 18:16:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.558 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.559 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.817 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.817 "name": "Existed_Raid", 00:13:10.817 "uuid": "8dd2d0b0-812e-4499-adab-6289b25dd488", 00:13:10.817 "strip_size_kb": 64, 00:13:10.817 "state": "online", 00:13:10.817 "raid_level": "raid0", 00:13:10.817 "superblock": false, 00:13:10.817 "num_base_bdevs": 3, 00:13:10.817 "num_base_bdevs_discovered": 3, 00:13:10.817 "num_base_bdevs_operational": 3, 00:13:10.817 "base_bdevs_list": [ 00:13:10.817 { 00:13:10.817 "name": "BaseBdev1", 00:13:10.817 "uuid": "35e3a6f6-5079-4334-952e-b25285546db7", 00:13:10.817 "is_configured": true, 00:13:10.817 "data_offset": 0, 00:13:10.817 "data_size": 65536 00:13:10.817 }, 00:13:10.817 { 00:13:10.817 "name": "BaseBdev2", 00:13:10.817 "uuid": 
"1690f9fe-9821-43d7-9dd4-c9afe8e9b899", 00:13:10.817 "is_configured": true, 00:13:10.817 "data_offset": 0, 00:13:10.817 "data_size": 65536 00:13:10.817 }, 00:13:10.817 { 00:13:10.817 "name": "BaseBdev3", 00:13:10.817 "uuid": "463bbe89-4786-49e6-a0f9-4e802c915719", 00:13:10.817 "is_configured": true, 00:13:10.817 "data_offset": 0, 00:13:10.817 "data_size": 65536 00:13:10.817 } 00:13:10.817 ] 00:13:10.817 }' 00:13:10.817 18:16:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.817 18:16:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.384 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:11.384 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:11.384 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:11.384 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:11.384 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:11.384 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:11.384 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:11.384 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:11.642 [2024-07-12 18:16:55.200987] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:11.642 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:11.642 "name": "Existed_Raid", 00:13:11.642 "aliases": [ 00:13:11.642 "8dd2d0b0-812e-4499-adab-6289b25dd488" 00:13:11.642 ], 00:13:11.642 "product_name": "Raid Volume", 
00:13:11.642 "block_size": 512, 00:13:11.642 "num_blocks": 196608, 00:13:11.642 "uuid": "8dd2d0b0-812e-4499-adab-6289b25dd488", 00:13:11.642 "assigned_rate_limits": { 00:13:11.642 "rw_ios_per_sec": 0, 00:13:11.642 "rw_mbytes_per_sec": 0, 00:13:11.642 "r_mbytes_per_sec": 0, 00:13:11.642 "w_mbytes_per_sec": 0 00:13:11.642 }, 00:13:11.642 "claimed": false, 00:13:11.642 "zoned": false, 00:13:11.642 "supported_io_types": { 00:13:11.642 "read": true, 00:13:11.642 "write": true, 00:13:11.642 "unmap": true, 00:13:11.642 "flush": true, 00:13:11.642 "reset": true, 00:13:11.642 "nvme_admin": false, 00:13:11.642 "nvme_io": false, 00:13:11.642 "nvme_io_md": false, 00:13:11.642 "write_zeroes": true, 00:13:11.642 "zcopy": false, 00:13:11.642 "get_zone_info": false, 00:13:11.642 "zone_management": false, 00:13:11.642 "zone_append": false, 00:13:11.642 "compare": false, 00:13:11.642 "compare_and_write": false, 00:13:11.643 "abort": false, 00:13:11.643 "seek_hole": false, 00:13:11.643 "seek_data": false, 00:13:11.643 "copy": false, 00:13:11.643 "nvme_iov_md": false 00:13:11.643 }, 00:13:11.643 "memory_domains": [ 00:13:11.643 { 00:13:11.643 "dma_device_id": "system", 00:13:11.643 "dma_device_type": 1 00:13:11.643 }, 00:13:11.643 { 00:13:11.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.643 "dma_device_type": 2 00:13:11.643 }, 00:13:11.643 { 00:13:11.643 "dma_device_id": "system", 00:13:11.643 "dma_device_type": 1 00:13:11.643 }, 00:13:11.643 { 00:13:11.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.643 "dma_device_type": 2 00:13:11.643 }, 00:13:11.643 { 00:13:11.643 "dma_device_id": "system", 00:13:11.643 "dma_device_type": 1 00:13:11.643 }, 00:13:11.643 { 00:13:11.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.643 "dma_device_type": 2 00:13:11.643 } 00:13:11.643 ], 00:13:11.643 "driver_specific": { 00:13:11.643 "raid": { 00:13:11.643 "uuid": "8dd2d0b0-812e-4499-adab-6289b25dd488", 00:13:11.643 "strip_size_kb": 64, 00:13:11.643 "state": "online", 00:13:11.643 
"raid_level": "raid0", 00:13:11.643 "superblock": false, 00:13:11.643 "num_base_bdevs": 3, 00:13:11.643 "num_base_bdevs_discovered": 3, 00:13:11.643 "num_base_bdevs_operational": 3, 00:13:11.643 "base_bdevs_list": [ 00:13:11.643 { 00:13:11.643 "name": "BaseBdev1", 00:13:11.643 "uuid": "35e3a6f6-5079-4334-952e-b25285546db7", 00:13:11.643 "is_configured": true, 00:13:11.643 "data_offset": 0, 00:13:11.643 "data_size": 65536 00:13:11.643 }, 00:13:11.643 { 00:13:11.643 "name": "BaseBdev2", 00:13:11.643 "uuid": "1690f9fe-9821-43d7-9dd4-c9afe8e9b899", 00:13:11.643 "is_configured": true, 00:13:11.643 "data_offset": 0, 00:13:11.643 "data_size": 65536 00:13:11.643 }, 00:13:11.643 { 00:13:11.643 "name": "BaseBdev3", 00:13:11.643 "uuid": "463bbe89-4786-49e6-a0f9-4e802c915719", 00:13:11.643 "is_configured": true, 00:13:11.643 "data_offset": 0, 00:13:11.643 "data_size": 65536 00:13:11.643 } 00:13:11.643 ] 00:13:11.643 } 00:13:11.643 } 00:13:11.643 }' 00:13:11.643 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:11.643 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:11.643 BaseBdev2 00:13:11.643 BaseBdev3' 00:13:11.643 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:11.643 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:11.643 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:11.902 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:11.902 "name": "BaseBdev1", 00:13:11.902 "aliases": [ 00:13:11.902 "35e3a6f6-5079-4334-952e-b25285546db7" 00:13:11.902 ], 00:13:11.902 "product_name": "Malloc disk", 00:13:11.902 
"block_size": 512, 00:13:11.902 "num_blocks": 65536, 00:13:11.902 "uuid": "35e3a6f6-5079-4334-952e-b25285546db7", 00:13:11.902 "assigned_rate_limits": { 00:13:11.902 "rw_ios_per_sec": 0, 00:13:11.902 "rw_mbytes_per_sec": 0, 00:13:11.902 "r_mbytes_per_sec": 0, 00:13:11.902 "w_mbytes_per_sec": 0 00:13:11.902 }, 00:13:11.902 "claimed": true, 00:13:11.902 "claim_type": "exclusive_write", 00:13:11.902 "zoned": false, 00:13:11.902 "supported_io_types": { 00:13:11.902 "read": true, 00:13:11.902 "write": true, 00:13:11.902 "unmap": true, 00:13:11.902 "flush": true, 00:13:11.902 "reset": true, 00:13:11.902 "nvme_admin": false, 00:13:11.902 "nvme_io": false, 00:13:11.902 "nvme_io_md": false, 00:13:11.902 "write_zeroes": true, 00:13:11.902 "zcopy": true, 00:13:11.902 "get_zone_info": false, 00:13:11.902 "zone_management": false, 00:13:11.902 "zone_append": false, 00:13:11.902 "compare": false, 00:13:11.902 "compare_and_write": false, 00:13:11.902 "abort": true, 00:13:11.902 "seek_hole": false, 00:13:11.902 "seek_data": false, 00:13:11.902 "copy": true, 00:13:11.902 "nvme_iov_md": false 00:13:11.902 }, 00:13:11.902 "memory_domains": [ 00:13:11.902 { 00:13:11.902 "dma_device_id": "system", 00:13:11.902 "dma_device_type": 1 00:13:11.902 }, 00:13:11.902 { 00:13:11.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:11.902 "dma_device_type": 2 00:13:11.902 } 00:13:11.902 ], 00:13:11.902 "driver_specific": {} 00:13:11.902 }' 00:13:11.902 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:11.902 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:11.902 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:11.902 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:12.160 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:12.161 18:16:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:12.420 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:12.420 "name": "BaseBdev2", 00:13:12.420 "aliases": [ 00:13:12.420 "1690f9fe-9821-43d7-9dd4-c9afe8e9b899" 00:13:12.420 ], 00:13:12.420 "product_name": "Malloc disk", 00:13:12.420 "block_size": 512, 00:13:12.420 "num_blocks": 65536, 00:13:12.420 "uuid": "1690f9fe-9821-43d7-9dd4-c9afe8e9b899", 00:13:12.420 "assigned_rate_limits": { 00:13:12.420 "rw_ios_per_sec": 0, 00:13:12.420 "rw_mbytes_per_sec": 0, 00:13:12.420 "r_mbytes_per_sec": 0, 00:13:12.420 "w_mbytes_per_sec": 0 00:13:12.420 }, 00:13:12.420 "claimed": true, 00:13:12.420 "claim_type": "exclusive_write", 00:13:12.420 "zoned": false, 00:13:12.420 "supported_io_types": { 00:13:12.420 "read": true, 00:13:12.420 "write": true, 00:13:12.420 "unmap": true, 00:13:12.420 "flush": true, 00:13:12.420 "reset": true, 00:13:12.420 "nvme_admin": 
false, 00:13:12.420 "nvme_io": false, 00:13:12.420 "nvme_io_md": false, 00:13:12.420 "write_zeroes": true, 00:13:12.420 "zcopy": true, 00:13:12.420 "get_zone_info": false, 00:13:12.420 "zone_management": false, 00:13:12.420 "zone_append": false, 00:13:12.420 "compare": false, 00:13:12.420 "compare_and_write": false, 00:13:12.420 "abort": true, 00:13:12.420 "seek_hole": false, 00:13:12.420 "seek_data": false, 00:13:12.420 "copy": true, 00:13:12.420 "nvme_iov_md": false 00:13:12.420 }, 00:13:12.420 "memory_domains": [ 00:13:12.420 { 00:13:12.420 "dma_device_id": "system", 00:13:12.420 "dma_device_type": 1 00:13:12.420 }, 00:13:12.420 { 00:13:12.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.420 "dma_device_type": 2 00:13:12.420 } 00:13:12.420 ], 00:13:12.420 "driver_specific": {} 00:13:12.420 }' 00:13:12.420 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.679 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:12.938 18:16:56 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:12.938 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:12.938 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:12.938 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:12.938 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:12.938 "name": "BaseBdev3", 00:13:12.938 "aliases": [ 00:13:12.938 "463bbe89-4786-49e6-a0f9-4e802c915719" 00:13:12.938 ], 00:13:12.938 "product_name": "Malloc disk", 00:13:12.938 "block_size": 512, 00:13:12.938 "num_blocks": 65536, 00:13:12.938 "uuid": "463bbe89-4786-49e6-a0f9-4e802c915719", 00:13:12.938 "assigned_rate_limits": { 00:13:12.938 "rw_ios_per_sec": 0, 00:13:12.938 "rw_mbytes_per_sec": 0, 00:13:12.938 "r_mbytes_per_sec": 0, 00:13:12.938 "w_mbytes_per_sec": 0 00:13:12.938 }, 00:13:12.938 "claimed": true, 00:13:12.938 "claim_type": "exclusive_write", 00:13:12.938 "zoned": false, 00:13:12.938 "supported_io_types": { 00:13:12.938 "read": true, 00:13:12.938 "write": true, 00:13:12.938 "unmap": true, 00:13:12.938 "flush": true, 00:13:12.938 "reset": true, 00:13:12.938 "nvme_admin": false, 00:13:12.938 "nvme_io": false, 00:13:12.938 "nvme_io_md": false, 00:13:12.938 "write_zeroes": true, 00:13:12.938 "zcopy": true, 00:13:12.938 "get_zone_info": false, 00:13:12.938 "zone_management": false, 00:13:12.938 "zone_append": false, 00:13:12.938 "compare": false, 00:13:12.938 "compare_and_write": false, 00:13:12.938 "abort": true, 00:13:12.938 "seek_hole": false, 00:13:12.938 "seek_data": false, 00:13:12.938 "copy": true, 00:13:12.938 "nvme_iov_md": false 00:13:12.938 }, 00:13:12.938 "memory_domains": [ 00:13:12.938 { 00:13:12.938 "dma_device_id": "system", 00:13:12.938 "dma_device_type": 1 00:13:12.938 
}, 00:13:12.938 { 00:13:12.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.938 "dma_device_type": 2 00:13:12.938 } 00:13:12.938 ], 00:13:12.938 "driver_specific": {} 00:13:12.938 }' 00:13:12.938 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:12.938 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:13.197 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:13.197 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:13.197 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:13.197 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:13.197 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:13.197 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:13.197 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:13.197 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:13.456 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:13.457 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:13.457 18:16:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:13.716 [2024-07-12 18:16:57.190024] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:13.716 [2024-07-12 18:16:57.190048] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:13.716 [2024-07-12 18:16:57.190085] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:13.716 
18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.716 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:13:13.976 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.976 "name": "Existed_Raid", 00:13:13.976 "uuid": "8dd2d0b0-812e-4499-adab-6289b25dd488", 00:13:13.976 "strip_size_kb": 64, 00:13:13.976 "state": "offline", 00:13:13.976 "raid_level": "raid0", 00:13:13.976 "superblock": false, 00:13:13.976 "num_base_bdevs": 3, 00:13:13.976 "num_base_bdevs_discovered": 2, 00:13:13.976 "num_base_bdevs_operational": 2, 00:13:13.976 "base_bdevs_list": [ 00:13:13.976 { 00:13:13.976 "name": null, 00:13:13.976 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.976 "is_configured": false, 00:13:13.976 "data_offset": 0, 00:13:13.976 "data_size": 65536 00:13:13.976 }, 00:13:13.976 { 00:13:13.976 "name": "BaseBdev2", 00:13:13.976 "uuid": "1690f9fe-9821-43d7-9dd4-c9afe8e9b899", 00:13:13.976 "is_configured": true, 00:13:13.976 "data_offset": 0, 00:13:13.976 "data_size": 65536 00:13:13.976 }, 00:13:13.976 { 00:13:13.976 "name": "BaseBdev3", 00:13:13.976 "uuid": "463bbe89-4786-49e6-a0f9-4e802c915719", 00:13:13.976 "is_configured": true, 00:13:13.976 "data_offset": 0, 00:13:13.976 "data_size": 65536 00:13:13.976 } 00:13:13.976 ] 00:13:13.976 }' 00:13:13.977 18:16:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.977 18:16:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.545 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:14.545 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:14.545 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.545 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:14.804 18:16:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:14.804 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:14.804 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:14.804 [2024-07-12 18:16:58.522566] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:15.063 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:15.063 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:15.063 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.063 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:15.322 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:15.322 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:15.322 18:16:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:15.322 [2024-07-12 18:16:59.020240] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:15.322 [2024-07-12 18:16:59.020280] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cb4400 name Existed_Raid, state offline 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:15.582 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:15.841 BaseBdev2 00:13:15.841 18:16:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:15.841 18:16:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:15.841 18:16:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:15.841 18:16:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:15.841 18:16:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:15.841 18:16:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:15.841 18:16:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:16.100 18:16:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:16.359 [ 00:13:16.359 { 00:13:16.359 "name": "BaseBdev2", 00:13:16.359 "aliases": [ 00:13:16.359 "3fa41f38-e942-41c4-ab10-9b41e627318a" 00:13:16.359 ], 00:13:16.359 "product_name": "Malloc disk", 00:13:16.359 "block_size": 512, 00:13:16.359 "num_blocks": 65536, 00:13:16.359 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:16.359 "assigned_rate_limits": { 00:13:16.359 "rw_ios_per_sec": 0, 00:13:16.360 "rw_mbytes_per_sec": 0, 00:13:16.360 "r_mbytes_per_sec": 0, 00:13:16.360 "w_mbytes_per_sec": 0 00:13:16.360 }, 00:13:16.360 "claimed": false, 00:13:16.360 "zoned": false, 00:13:16.360 "supported_io_types": { 00:13:16.360 "read": true, 00:13:16.360 "write": true, 00:13:16.360 "unmap": true, 00:13:16.360 "flush": true, 00:13:16.360 "reset": true, 00:13:16.360 "nvme_admin": false, 00:13:16.360 "nvme_io": false, 00:13:16.360 "nvme_io_md": false, 00:13:16.360 "write_zeroes": true, 00:13:16.360 "zcopy": true, 00:13:16.360 "get_zone_info": false, 00:13:16.360 "zone_management": false, 00:13:16.360 "zone_append": false, 00:13:16.360 "compare": false, 00:13:16.360 "compare_and_write": false, 00:13:16.360 "abort": true, 00:13:16.360 "seek_hole": false, 00:13:16.360 "seek_data": false, 00:13:16.360 "copy": true, 00:13:16.360 "nvme_iov_md": false 00:13:16.360 }, 00:13:16.360 "memory_domains": [ 00:13:16.360 { 00:13:16.360 "dma_device_id": "system", 00:13:16.360 "dma_device_type": 1 00:13:16.360 }, 00:13:16.360 { 00:13:16.360 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.360 "dma_device_type": 2 00:13:16.360 } 00:13:16.360 ], 00:13:16.360 "driver_specific": {} 00:13:16.360 } 00:13:16.360 ] 00:13:16.360 18:16:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:16.360 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:16.360 18:17:00 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:16.360 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:16.618 BaseBdev3 00:13:16.618 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:16.618 18:17:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:16.619 18:17:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:16.619 18:17:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:16.619 18:17:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:16.619 18:17:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:16.619 18:17:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:16.877 18:17:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:17.136 [ 00:13:17.136 { 00:13:17.136 "name": "BaseBdev3", 00:13:17.136 "aliases": [ 00:13:17.136 "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb" 00:13:17.136 ], 00:13:17.136 "product_name": "Malloc disk", 00:13:17.136 "block_size": 512, 00:13:17.136 "num_blocks": 65536, 00:13:17.136 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:17.136 "assigned_rate_limits": { 00:13:17.136 "rw_ios_per_sec": 0, 00:13:17.136 "rw_mbytes_per_sec": 0, 00:13:17.136 "r_mbytes_per_sec": 0, 00:13:17.136 "w_mbytes_per_sec": 0 00:13:17.136 }, 00:13:17.136 "claimed": false, 00:13:17.136 "zoned": false, 00:13:17.136 
"supported_io_types": { 00:13:17.136 "read": true, 00:13:17.136 "write": true, 00:13:17.136 "unmap": true, 00:13:17.136 "flush": true, 00:13:17.136 "reset": true, 00:13:17.136 "nvme_admin": false, 00:13:17.136 "nvme_io": false, 00:13:17.136 "nvme_io_md": false, 00:13:17.136 "write_zeroes": true, 00:13:17.136 "zcopy": true, 00:13:17.136 "get_zone_info": false, 00:13:17.136 "zone_management": false, 00:13:17.136 "zone_append": false, 00:13:17.136 "compare": false, 00:13:17.136 "compare_and_write": false, 00:13:17.136 "abort": true, 00:13:17.136 "seek_hole": false, 00:13:17.136 "seek_data": false, 00:13:17.136 "copy": true, 00:13:17.136 "nvme_iov_md": false 00:13:17.136 }, 00:13:17.136 "memory_domains": [ 00:13:17.136 { 00:13:17.136 "dma_device_id": "system", 00:13:17.136 "dma_device_type": 1 00:13:17.136 }, 00:13:17.136 { 00:13:17.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.136 "dma_device_type": 2 00:13:17.136 } 00:13:17.136 ], 00:13:17.136 "driver_specific": {} 00:13:17.136 } 00:13:17.136 ] 00:13:17.136 18:17:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:17.136 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:17.136 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:17.136 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:17.396 [2024-07-12 18:17:00.920288] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:17.396 [2024-07-12 18:17:00.920333] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:17.396 [2024-07-12 18:17:00.920351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:17.396 
[2024-07-12 18:17:00.921885] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.396 18:17:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:17.655 18:17:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:17.655 "name": "Existed_Raid", 00:13:17.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.655 "strip_size_kb": 64, 00:13:17.655 "state": "configuring", 00:13:17.655 "raid_level": "raid0", 00:13:17.655 "superblock": false, 00:13:17.655 "num_base_bdevs": 3, 00:13:17.655 
"num_base_bdevs_discovered": 2, 00:13:17.655 "num_base_bdevs_operational": 3, 00:13:17.655 "base_bdevs_list": [ 00:13:17.655 { 00:13:17.655 "name": "BaseBdev1", 00:13:17.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:17.655 "is_configured": false, 00:13:17.655 "data_offset": 0, 00:13:17.655 "data_size": 0 00:13:17.655 }, 00:13:17.655 { 00:13:17.655 "name": "BaseBdev2", 00:13:17.655 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:17.655 "is_configured": true, 00:13:17.655 "data_offset": 0, 00:13:17.655 "data_size": 65536 00:13:17.655 }, 00:13:17.655 { 00:13:17.655 "name": "BaseBdev3", 00:13:17.655 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:17.655 "is_configured": true, 00:13:17.655 "data_offset": 0, 00:13:17.655 "data_size": 65536 00:13:17.655 } 00:13:17.655 ] 00:13:17.655 }' 00:13:17.655 18:17:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:17.655 18:17:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:18.223 18:17:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:18.482 [2024-07-12 18:17:02.011170] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.482 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.483 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:18.742 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:18.742 "name": "Existed_Raid", 00:13:18.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:18.742 "strip_size_kb": 64, 00:13:18.742 "state": "configuring", 00:13:18.742 "raid_level": "raid0", 00:13:18.742 "superblock": false, 00:13:18.742 "num_base_bdevs": 3, 00:13:18.742 "num_base_bdevs_discovered": 1, 00:13:18.742 "num_base_bdevs_operational": 3, 00:13:18.742 "base_bdevs_list": [ 00:13:18.742 { 00:13:18.742 "name": "BaseBdev1", 00:13:18.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:18.742 "is_configured": false, 00:13:18.742 "data_offset": 0, 00:13:18.742 "data_size": 0 00:13:18.742 }, 00:13:18.742 { 00:13:18.742 "name": null, 00:13:18.742 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:18.742 "is_configured": false, 00:13:18.742 "data_offset": 0, 00:13:18.742 "data_size": 65536 00:13:18.742 }, 00:13:18.742 { 00:13:18.742 "name": "BaseBdev3", 00:13:18.742 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:18.742 "is_configured": true, 00:13:18.742 "data_offset": 0, 00:13:18.742 "data_size": 65536 00:13:18.742 } 
00:13:18.742 ] 00:13:18.742 }' 00:13:18.742 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:18.742 18:17:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.309 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.309 18:17:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:19.568 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:19.568 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:19.827 [2024-07-12 18:17:03.387678] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:19.827 BaseBdev1 00:13:19.827 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:19.827 18:17:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:19.827 18:17:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:19.827 18:17:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:19.827 18:17:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:19.827 18:17:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:19.827 18:17:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:20.086 18:17:03 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:20.346 [ 00:13:20.346 { 00:13:20.346 "name": "BaseBdev1", 00:13:20.346 "aliases": [ 00:13:20.346 "825d4a18-fb8a-4c27-896c-5cf5dcd725d9" 00:13:20.346 ], 00:13:20.346 "product_name": "Malloc disk", 00:13:20.346 "block_size": 512, 00:13:20.346 "num_blocks": 65536, 00:13:20.346 "uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:20.346 "assigned_rate_limits": { 00:13:20.346 "rw_ios_per_sec": 0, 00:13:20.346 "rw_mbytes_per_sec": 0, 00:13:20.346 "r_mbytes_per_sec": 0, 00:13:20.346 "w_mbytes_per_sec": 0 00:13:20.346 }, 00:13:20.346 "claimed": true, 00:13:20.346 "claim_type": "exclusive_write", 00:13:20.346 "zoned": false, 00:13:20.346 "supported_io_types": { 00:13:20.346 "read": true, 00:13:20.346 "write": true, 00:13:20.346 "unmap": true, 00:13:20.346 "flush": true, 00:13:20.346 "reset": true, 00:13:20.346 "nvme_admin": false, 00:13:20.346 "nvme_io": false, 00:13:20.346 "nvme_io_md": false, 00:13:20.346 "write_zeroes": true, 00:13:20.346 "zcopy": true, 00:13:20.346 "get_zone_info": false, 00:13:20.346 "zone_management": false, 00:13:20.346 "zone_append": false, 00:13:20.346 "compare": false, 00:13:20.346 "compare_and_write": false, 00:13:20.346 "abort": true, 00:13:20.346 "seek_hole": false, 00:13:20.346 "seek_data": false, 00:13:20.346 "copy": true, 00:13:20.346 "nvme_iov_md": false 00:13:20.346 }, 00:13:20.346 "memory_domains": [ 00:13:20.346 { 00:13:20.346 "dma_device_id": "system", 00:13:20.346 "dma_device_type": 1 00:13:20.346 }, 00:13:20.346 { 00:13:20.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:20.346 "dma_device_type": 2 00:13:20.346 } 00:13:20.346 ], 00:13:20.346 "driver_specific": {} 00:13:20.346 } 00:13:20.346 ] 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.346 18:17:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:20.606 18:17:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.606 "name": "Existed_Raid", 00:13:20.606 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:20.606 "strip_size_kb": 64, 00:13:20.606 "state": "configuring", 00:13:20.606 "raid_level": "raid0", 00:13:20.606 "superblock": false, 00:13:20.606 "num_base_bdevs": 3, 00:13:20.606 "num_base_bdevs_discovered": 2, 00:13:20.606 "num_base_bdevs_operational": 3, 00:13:20.606 "base_bdevs_list": [ 00:13:20.606 { 00:13:20.606 "name": "BaseBdev1", 00:13:20.606 
"uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:20.606 "is_configured": true, 00:13:20.606 "data_offset": 0, 00:13:20.606 "data_size": 65536 00:13:20.606 }, 00:13:20.606 { 00:13:20.606 "name": null, 00:13:20.606 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:20.606 "is_configured": false, 00:13:20.606 "data_offset": 0, 00:13:20.606 "data_size": 65536 00:13:20.606 }, 00:13:20.606 { 00:13:20.606 "name": "BaseBdev3", 00:13:20.606 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:20.606 "is_configured": true, 00:13:20.606 "data_offset": 0, 00:13:20.606 "data_size": 65536 00:13:20.606 } 00:13:20.606 ] 00:13:20.606 }' 00:13:20.606 18:17:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.606 18:17:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.174 18:17:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.174 18:17:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:21.432 18:17:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:21.432 18:17:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:21.432 [2024-07-12 18:17:05.104280] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:21.432 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
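Every `verify_raid_bdev_state` cycle in this log follows the same pattern: dump the raid bdev as JSON over the RPC socket, select it with `jq -r '.[] | select(.name == "Existed_Raid")'`, and compare individual fields (`state`, `raid_level`, `strip_size_kb`, `num_base_bdevs`) against expected values. The sketch below reproduces just that field-check step in plain bash. It is a minimal stand-in, not the test suite's implementation: the literal `raid_json` replaces live `rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all` output (no running SPDK app is assumed), and `sed` stands in for the `jq` filtering the log actually uses.

```shell
#!/usr/bin/env bash
# Literal sample of the raid bdev JSON seen in this log (stand-in for
# live output from rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all).
raid_json='{ "name": "Existed_Raid", "state": "configuring",
  "raid_level": "raid0", "strip_size_kb": 64,
  "num_base_bdevs": 3, "num_base_bdevs_discovered": 1 }'

# Pull one scalar field out of the JSON; sed stands in for jq here.
get_field() {
  printf '%s\n' "$1" |
    tr '\n' ' ' |
    sed -n "s/.*\"$2\": *\"\{0,1\}\([^\",} ]*\)\"\{0,1\}.*/\1/p"
}

# The same comparisons the log's verify_raid_bdev_state helper performs.
check_raid_state() {
  local expected_state=$1 raid_level=$2 strip_size=$3 num_bdevs=$4
  [ "$(get_field "$raid_json" state)" = "$expected_state" ] &&
  [ "$(get_field "$raid_json" raid_level)" = "$raid_level" ] &&
  [ "$(get_field "$raid_json" strip_size_kb)" = "$strip_size" ] &&
  [ "$(get_field "$raid_json" num_base_bdevs)" = "$num_bdevs" ]
}

check_raid_state configuring raid0 64 3 && echo "state ok"
```

In the log, the raid stays in `state: "configuring"` until all three base bdevs are configured, which is why each add/remove of a base bdev above is followed by exactly this kind of check before the state finally flips to `online`.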
00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.433 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.691 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.691 "name": "Existed_Raid", 00:13:21.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:21.691 "strip_size_kb": 64, 00:13:21.691 "state": "configuring", 00:13:21.691 "raid_level": "raid0", 00:13:21.691 "superblock": false, 00:13:21.691 "num_base_bdevs": 3, 00:13:21.691 "num_base_bdevs_discovered": 1, 00:13:21.691 "num_base_bdevs_operational": 3, 00:13:21.691 "base_bdevs_list": [ 00:13:21.691 { 00:13:21.691 "name": "BaseBdev1", 00:13:21.691 "uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:21.691 "is_configured": true, 00:13:21.691 "data_offset": 0, 00:13:21.691 "data_size": 65536 00:13:21.691 }, 00:13:21.691 { 00:13:21.691 "name": null, 00:13:21.691 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:21.691 "is_configured": false, 00:13:21.691 
"data_offset": 0, 00:13:21.691 "data_size": 65536 00:13:21.691 }, 00:13:21.691 { 00:13:21.691 "name": null, 00:13:21.691 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:21.691 "is_configured": false, 00:13:21.691 "data_offset": 0, 00:13:21.691 "data_size": 65536 00:13:21.691 } 00:13:21.691 ] 00:13:21.691 }' 00:13:21.691 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.691 18:17:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:22.256 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.256 18:17:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:22.514 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:22.514 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:22.772 [2024-07-12 18:17:06.431835] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.772 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.030 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.030 "name": "Existed_Raid", 00:13:23.030 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.030 "strip_size_kb": 64, 00:13:23.030 "state": "configuring", 00:13:23.030 "raid_level": "raid0", 00:13:23.030 "superblock": false, 00:13:23.030 "num_base_bdevs": 3, 00:13:23.030 "num_base_bdevs_discovered": 2, 00:13:23.030 "num_base_bdevs_operational": 3, 00:13:23.030 "base_bdevs_list": [ 00:13:23.030 { 00:13:23.030 "name": "BaseBdev1", 00:13:23.030 "uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:23.030 "is_configured": true, 00:13:23.030 "data_offset": 0, 00:13:23.030 "data_size": 65536 00:13:23.030 }, 00:13:23.030 { 00:13:23.030 "name": null, 00:13:23.030 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:23.030 "is_configured": false, 00:13:23.030 "data_offset": 0, 00:13:23.030 "data_size": 65536 00:13:23.030 }, 00:13:23.030 { 00:13:23.030 "name": "BaseBdev3", 00:13:23.030 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:23.030 "is_configured": true, 00:13:23.030 "data_offset": 0, 00:13:23.030 "data_size": 65536 00:13:23.030 } 00:13:23.030 ] 
00:13:23.030 }' 00:13:23.030 18:17:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.030 18:17:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.595 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.595 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:23.853 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:23.853 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:24.111 [2024-07-12 18:17:07.787439] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.111 18:17:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.369 18:17:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.369 "name": "Existed_Raid", 00:13:24.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.369 "strip_size_kb": 64, 00:13:24.369 "state": "configuring", 00:13:24.369 "raid_level": "raid0", 00:13:24.369 "superblock": false, 00:13:24.369 "num_base_bdevs": 3, 00:13:24.369 "num_base_bdevs_discovered": 1, 00:13:24.369 "num_base_bdevs_operational": 3, 00:13:24.369 "base_bdevs_list": [ 00:13:24.369 { 00:13:24.369 "name": null, 00:13:24.369 "uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:24.369 "is_configured": false, 00:13:24.369 "data_offset": 0, 00:13:24.369 "data_size": 65536 00:13:24.369 }, 00:13:24.369 { 00:13:24.369 "name": null, 00:13:24.369 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:24.369 "is_configured": false, 00:13:24.369 "data_offset": 0, 00:13:24.369 "data_size": 65536 00:13:24.369 }, 00:13:24.369 { 00:13:24.369 "name": "BaseBdev3", 00:13:24.369 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:24.369 "is_configured": true, 00:13:24.369 "data_offset": 0, 00:13:24.369 "data_size": 65536 00:13:24.369 } 00:13:24.369 ] 00:13:24.369 }' 00:13:24.369 18:17:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.369 18:17:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.301 18:17:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.301 18:17:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:25.301 18:17:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:25.301 18:17:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:25.591 [2024-07-12 18:17:09.162783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.591 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.863 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.863 "name": "Existed_Raid", 00:13:25.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.863 "strip_size_kb": 64, 00:13:25.863 "state": "configuring", 00:13:25.863 "raid_level": "raid0", 00:13:25.863 "superblock": false, 00:13:25.863 "num_base_bdevs": 3, 00:13:25.863 "num_base_bdevs_discovered": 2, 00:13:25.863 "num_base_bdevs_operational": 3, 00:13:25.863 "base_bdevs_list": [ 00:13:25.863 { 00:13:25.863 "name": null, 00:13:25.863 "uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:25.863 "is_configured": false, 00:13:25.863 "data_offset": 0, 00:13:25.863 "data_size": 65536 00:13:25.863 }, 00:13:25.863 { 00:13:25.863 "name": "BaseBdev2", 00:13:25.863 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:25.863 "is_configured": true, 00:13:25.863 "data_offset": 0, 00:13:25.863 "data_size": 65536 00:13:25.863 }, 00:13:25.863 { 00:13:25.863 "name": "BaseBdev3", 00:13:25.863 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:25.863 "is_configured": true, 00:13:25.863 "data_offset": 0, 00:13:25.863 "data_size": 65536 00:13:25.863 } 00:13:25.863 ] 00:13:25.863 }' 00:13:25.863 18:17:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.863 18:17:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.430 18:17:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.430 18:17:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:26.689 
18:17:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:26.689 18:17:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.689 18:17:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:27.257 18:17:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 825d4a18-fb8a-4c27-896c-5cf5dcd725d9 00:13:27.515 [2024-07-12 18:17:11.076700] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:27.516 [2024-07-12 18:17:11.076745] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cb2450 00:13:27.516 [2024-07-12 18:17:11.076754] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:27.516 [2024-07-12 18:17:11.076984] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cb3a50 00:13:27.516 [2024-07-12 18:17:11.077117] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cb2450 00:13:27.516 [2024-07-12 18:17:11.077127] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1cb2450 00:13:27.516 [2024-07-12 18:17:11.077303] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:27.516 NewBaseBdev 00:13:27.516 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:27.516 18:17:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:27.516 18:17:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:27.516 18:17:11 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:13:27.516 18:17:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:27.516 18:17:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:27.516 18:17:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:27.775 18:17:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:28.033 [ 00:13:28.033 { 00:13:28.033 "name": "NewBaseBdev", 00:13:28.033 "aliases": [ 00:13:28.033 "825d4a18-fb8a-4c27-896c-5cf5dcd725d9" 00:13:28.033 ], 00:13:28.033 "product_name": "Malloc disk", 00:13:28.033 "block_size": 512, 00:13:28.033 "num_blocks": 65536, 00:13:28.033 "uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:28.033 "assigned_rate_limits": { 00:13:28.033 "rw_ios_per_sec": 0, 00:13:28.033 "rw_mbytes_per_sec": 0, 00:13:28.033 "r_mbytes_per_sec": 0, 00:13:28.033 "w_mbytes_per_sec": 0 00:13:28.033 }, 00:13:28.033 "claimed": true, 00:13:28.033 "claim_type": "exclusive_write", 00:13:28.033 "zoned": false, 00:13:28.033 "supported_io_types": { 00:13:28.033 "read": true, 00:13:28.033 "write": true, 00:13:28.033 "unmap": true, 00:13:28.033 "flush": true, 00:13:28.033 "reset": true, 00:13:28.033 "nvme_admin": false, 00:13:28.033 "nvme_io": false, 00:13:28.033 "nvme_io_md": false, 00:13:28.033 "write_zeroes": true, 00:13:28.033 "zcopy": true, 00:13:28.033 "get_zone_info": false, 00:13:28.033 "zone_management": false, 00:13:28.033 "zone_append": false, 00:13:28.033 "compare": false, 00:13:28.033 "compare_and_write": false, 00:13:28.033 "abort": true, 00:13:28.033 "seek_hole": false, 00:13:28.033 "seek_data": false, 00:13:28.033 "copy": true, 00:13:28.033 "nvme_iov_md": 
false 00:13:28.033 }, 00:13:28.033 "memory_domains": [ 00:13:28.033 { 00:13:28.033 "dma_device_id": "system", 00:13:28.033 "dma_device_type": 1 00:13:28.033 }, 00:13:28.033 { 00:13:28.033 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:28.033 "dma_device_type": 2 00:13:28.033 } 00:13:28.033 ], 00:13:28.033 "driver_specific": {} 00:13:28.033 } 00:13:28.033 ] 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.033 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.292 18:17:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.292 "name": "Existed_Raid", 00:13:28.292 "uuid": "8cd86da7-ed77-4d9d-8174-ee20e3dca63f", 00:13:28.292 "strip_size_kb": 64, 00:13:28.292 "state": "online", 00:13:28.292 "raid_level": "raid0", 00:13:28.292 "superblock": false, 00:13:28.292 "num_base_bdevs": 3, 00:13:28.292 "num_base_bdevs_discovered": 3, 00:13:28.292 "num_base_bdevs_operational": 3, 00:13:28.292 "base_bdevs_list": [ 00:13:28.292 { 00:13:28.292 "name": "NewBaseBdev", 00:13:28.292 "uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:28.292 "is_configured": true, 00:13:28.292 "data_offset": 0, 00:13:28.292 "data_size": 65536 00:13:28.292 }, 00:13:28.292 { 00:13:28.292 "name": "BaseBdev2", 00:13:28.292 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:28.292 "is_configured": true, 00:13:28.292 "data_offset": 0, 00:13:28.292 "data_size": 65536 00:13:28.292 }, 00:13:28.292 { 00:13:28.292 "name": "BaseBdev3", 00:13:28.292 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:28.292 "is_configured": true, 00:13:28.292 "data_offset": 0, 00:13:28.292 "data_size": 65536 00:13:28.292 } 00:13:28.292 ] 00:13:28.292 }' 00:13:28.292 18:17:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.292 18:17:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.859 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:28.859 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:28.859 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:28.859 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:28.859 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:28.859 18:17:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:28.859 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:28.859 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:29.118 [2024-07-12 18:17:12.629135] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:29.118 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:29.118 "name": "Existed_Raid", 00:13:29.118 "aliases": [ 00:13:29.118 "8cd86da7-ed77-4d9d-8174-ee20e3dca63f" 00:13:29.118 ], 00:13:29.118 "product_name": "Raid Volume", 00:13:29.118 "block_size": 512, 00:13:29.118 "num_blocks": 196608, 00:13:29.118 "uuid": "8cd86da7-ed77-4d9d-8174-ee20e3dca63f", 00:13:29.118 "assigned_rate_limits": { 00:13:29.118 "rw_ios_per_sec": 0, 00:13:29.118 "rw_mbytes_per_sec": 0, 00:13:29.118 "r_mbytes_per_sec": 0, 00:13:29.118 "w_mbytes_per_sec": 0 00:13:29.118 }, 00:13:29.118 "claimed": false, 00:13:29.118 "zoned": false, 00:13:29.118 "supported_io_types": { 00:13:29.118 "read": true, 00:13:29.118 "write": true, 00:13:29.118 "unmap": true, 00:13:29.118 "flush": true, 00:13:29.118 "reset": true, 00:13:29.118 "nvme_admin": false, 00:13:29.118 "nvme_io": false, 00:13:29.118 "nvme_io_md": false, 00:13:29.118 "write_zeroes": true, 00:13:29.118 "zcopy": false, 00:13:29.118 "get_zone_info": false, 00:13:29.118 "zone_management": false, 00:13:29.118 "zone_append": false, 00:13:29.118 "compare": false, 00:13:29.118 "compare_and_write": false, 00:13:29.118 "abort": false, 00:13:29.118 "seek_hole": false, 00:13:29.118 "seek_data": false, 00:13:29.118 "copy": false, 00:13:29.118 "nvme_iov_md": false 00:13:29.118 }, 00:13:29.118 "memory_domains": [ 00:13:29.118 { 00:13:29.118 "dma_device_id": "system", 00:13:29.118 "dma_device_type": 1 00:13:29.118 }, 
00:13:29.118 { 00:13:29.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.118 "dma_device_type": 2 00:13:29.118 }, 00:13:29.118 { 00:13:29.118 "dma_device_id": "system", 00:13:29.118 "dma_device_type": 1 00:13:29.118 }, 00:13:29.118 { 00:13:29.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.118 "dma_device_type": 2 00:13:29.118 }, 00:13:29.118 { 00:13:29.118 "dma_device_id": "system", 00:13:29.118 "dma_device_type": 1 00:13:29.118 }, 00:13:29.118 { 00:13:29.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.118 "dma_device_type": 2 00:13:29.118 } 00:13:29.118 ], 00:13:29.118 "driver_specific": { 00:13:29.118 "raid": { 00:13:29.118 "uuid": "8cd86da7-ed77-4d9d-8174-ee20e3dca63f", 00:13:29.118 "strip_size_kb": 64, 00:13:29.118 "state": "online", 00:13:29.118 "raid_level": "raid0", 00:13:29.118 "superblock": false, 00:13:29.118 "num_base_bdevs": 3, 00:13:29.118 "num_base_bdevs_discovered": 3, 00:13:29.118 "num_base_bdevs_operational": 3, 00:13:29.118 "base_bdevs_list": [ 00:13:29.118 { 00:13:29.118 "name": "NewBaseBdev", 00:13:29.118 "uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:29.118 "is_configured": true, 00:13:29.118 "data_offset": 0, 00:13:29.118 "data_size": 65536 00:13:29.118 }, 00:13:29.118 { 00:13:29.118 "name": "BaseBdev2", 00:13:29.118 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:29.118 "is_configured": true, 00:13:29.118 "data_offset": 0, 00:13:29.118 "data_size": 65536 00:13:29.118 }, 00:13:29.118 { 00:13:29.118 "name": "BaseBdev3", 00:13:29.118 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:29.118 "is_configured": true, 00:13:29.118 "data_offset": 0, 00:13:29.118 "data_size": 65536 00:13:29.118 } 00:13:29.118 ] 00:13:29.118 } 00:13:29.118 } 00:13:29.118 }' 00:13:29.118 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:29.118 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:13:29.118 BaseBdev2 00:13:29.118 BaseBdev3' 00:13:29.118 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:29.118 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:29.118 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:29.377 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:29.377 "name": "NewBaseBdev", 00:13:29.377 "aliases": [ 00:13:29.377 "825d4a18-fb8a-4c27-896c-5cf5dcd725d9" 00:13:29.377 ], 00:13:29.377 "product_name": "Malloc disk", 00:13:29.377 "block_size": 512, 00:13:29.377 "num_blocks": 65536, 00:13:29.377 "uuid": "825d4a18-fb8a-4c27-896c-5cf5dcd725d9", 00:13:29.377 "assigned_rate_limits": { 00:13:29.377 "rw_ios_per_sec": 0, 00:13:29.377 "rw_mbytes_per_sec": 0, 00:13:29.377 "r_mbytes_per_sec": 0, 00:13:29.377 "w_mbytes_per_sec": 0 00:13:29.377 }, 00:13:29.377 "claimed": true, 00:13:29.377 "claim_type": "exclusive_write", 00:13:29.377 "zoned": false, 00:13:29.377 "supported_io_types": { 00:13:29.377 "read": true, 00:13:29.377 "write": true, 00:13:29.377 "unmap": true, 00:13:29.377 "flush": true, 00:13:29.377 "reset": true, 00:13:29.377 "nvme_admin": false, 00:13:29.377 "nvme_io": false, 00:13:29.377 "nvme_io_md": false, 00:13:29.377 "write_zeroes": true, 00:13:29.377 "zcopy": true, 00:13:29.377 "get_zone_info": false, 00:13:29.377 "zone_management": false, 00:13:29.377 "zone_append": false, 00:13:29.377 "compare": false, 00:13:29.377 "compare_and_write": false, 00:13:29.377 "abort": true, 00:13:29.377 "seek_hole": false, 00:13:29.377 "seek_data": false, 00:13:29.377 "copy": true, 00:13:29.377 "nvme_iov_md": false 00:13:29.377 }, 00:13:29.377 "memory_domains": [ 00:13:29.377 { 00:13:29.377 "dma_device_id": "system", 00:13:29.377 
"dma_device_type": 1 00:13:29.377 }, 00:13:29.377 { 00:13:29.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.377 "dma_device_type": 2 00:13:29.377 } 00:13:29.377 ], 00:13:29.377 "driver_specific": {} 00:13:29.377 }' 00:13:29.377 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:29.377 18:17:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:29.377 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:29.377 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:29.377 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:29.635 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:29.893 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:29.893 "name": 
"BaseBdev2", 00:13:29.893 "aliases": [ 00:13:29.893 "3fa41f38-e942-41c4-ab10-9b41e627318a" 00:13:29.893 ], 00:13:29.893 "product_name": "Malloc disk", 00:13:29.893 "block_size": 512, 00:13:29.893 "num_blocks": 65536, 00:13:29.893 "uuid": "3fa41f38-e942-41c4-ab10-9b41e627318a", 00:13:29.893 "assigned_rate_limits": { 00:13:29.893 "rw_ios_per_sec": 0, 00:13:29.893 "rw_mbytes_per_sec": 0, 00:13:29.893 "r_mbytes_per_sec": 0, 00:13:29.893 "w_mbytes_per_sec": 0 00:13:29.893 }, 00:13:29.893 "claimed": true, 00:13:29.893 "claim_type": "exclusive_write", 00:13:29.893 "zoned": false, 00:13:29.893 "supported_io_types": { 00:13:29.893 "read": true, 00:13:29.893 "write": true, 00:13:29.893 "unmap": true, 00:13:29.893 "flush": true, 00:13:29.893 "reset": true, 00:13:29.893 "nvme_admin": false, 00:13:29.893 "nvme_io": false, 00:13:29.893 "nvme_io_md": false, 00:13:29.893 "write_zeroes": true, 00:13:29.893 "zcopy": true, 00:13:29.893 "get_zone_info": false, 00:13:29.893 "zone_management": false, 00:13:29.893 "zone_append": false, 00:13:29.893 "compare": false, 00:13:29.893 "compare_and_write": false, 00:13:29.893 "abort": true, 00:13:29.893 "seek_hole": false, 00:13:29.893 "seek_data": false, 00:13:29.893 "copy": true, 00:13:29.893 "nvme_iov_md": false 00:13:29.893 }, 00:13:29.893 "memory_domains": [ 00:13:29.893 { 00:13:29.893 "dma_device_id": "system", 00:13:29.893 "dma_device_type": 1 00:13:29.893 }, 00:13:29.893 { 00:13:29.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.893 "dma_device_type": 2 00:13:29.893 } 00:13:29.893 ], 00:13:29.893 "driver_specific": {} 00:13:29.893 }' 00:13:29.893 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:29.893 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:29.893 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:29.893 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:13:29.893 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:30.152 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:30.412 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:30.412 "name": "BaseBdev3", 00:13:30.412 "aliases": [ 00:13:30.412 "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb" 00:13:30.412 ], 00:13:30.412 "product_name": "Malloc disk", 00:13:30.412 "block_size": 512, 00:13:30.412 "num_blocks": 65536, 00:13:30.412 "uuid": "a8dcd76c-9b29-4a55-8c25-cd8d9da624cb", 00:13:30.412 "assigned_rate_limits": { 00:13:30.412 "rw_ios_per_sec": 0, 00:13:30.412 "rw_mbytes_per_sec": 0, 00:13:30.412 "r_mbytes_per_sec": 0, 00:13:30.412 "w_mbytes_per_sec": 0 00:13:30.412 }, 00:13:30.412 "claimed": true, 00:13:30.412 "claim_type": "exclusive_write", 00:13:30.412 "zoned": false, 00:13:30.412 "supported_io_types": { 
00:13:30.412 "read": true, 00:13:30.412 "write": true, 00:13:30.412 "unmap": true, 00:13:30.412 "flush": true, 00:13:30.412 "reset": true, 00:13:30.412 "nvme_admin": false, 00:13:30.412 "nvme_io": false, 00:13:30.412 "nvme_io_md": false, 00:13:30.412 "write_zeroes": true, 00:13:30.412 "zcopy": true, 00:13:30.412 "get_zone_info": false, 00:13:30.412 "zone_management": false, 00:13:30.412 "zone_append": false, 00:13:30.412 "compare": false, 00:13:30.412 "compare_and_write": false, 00:13:30.412 "abort": true, 00:13:30.412 "seek_hole": false, 00:13:30.412 "seek_data": false, 00:13:30.412 "copy": true, 00:13:30.412 "nvme_iov_md": false 00:13:30.412 }, 00:13:30.412 "memory_domains": [ 00:13:30.412 { 00:13:30.412 "dma_device_id": "system", 00:13:30.412 "dma_device_type": 1 00:13:30.412 }, 00:13:30.412 { 00:13:30.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.412 "dma_device_type": 2 00:13:30.412 } 00:13:30.412 ], 00:13:30.412 "driver_specific": {} 00:13:30.412 }' 00:13:30.412 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.412 18:17:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:30.412 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:30.412 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.412 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:30.412 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:30.412 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.412 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:30.671 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:30.671 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:13:30.671 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:30.671 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:30.671 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:30.929 [2024-07-12 18:17:14.449675] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:30.929 [2024-07-12 18:17:14.449708] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:30.929 [2024-07-12 18:17:14.449773] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:30.929 [2024-07-12 18:17:14.449829] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:30.929 [2024-07-12 18:17:14.449841] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cb2450 name Existed_Raid, state offline 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2477407 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2477407 ']' 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2477407 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2477407 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2477407' 00:13:30.929 killing process with pid 2477407 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2477407 00:13:30.929 [2024-07-12 18:17:14.512908] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:30.929 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2477407 00:13:30.929 [2024-07-12 18:17:14.575579] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:31.496 18:17:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:31.496 00:13:31.496 real 0m28.516s 00:13:31.496 user 0m52.146s 00:13:31.496 sys 0m5.006s 00:13:31.496 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:31.496 18:17:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.496 ************************************ 00:13:31.496 END TEST raid_state_function_test 00:13:31.496 ************************************ 00:13:31.496 18:17:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:31.496 18:17:14 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:13:31.496 18:17:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:31.496 18:17:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:31.496 18:17:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:31.496 ************************************ 00:13:31.496 START TEST raid_state_function_test_sb 00:13:31.496 ************************************ 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2481876 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2481876' 00:13:31.496 Process raid pid: 2481876 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2481876 /var/tmp/spdk-raid.sock 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2481876 ']' 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:13:31.496 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:31.496 18:17:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:31.496 [2024-07-12 18:17:15.094798] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:13:31.496 [2024-07-12 18:17:15.094865] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:31.754 [2024-07-12 18:17:15.225148] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:31.754 [2024-07-12 18:17:15.327011] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:31.755 [2024-07-12 18:17:15.385790] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:31.755 [2024-07-12 18:17:15.385838] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:32.320 18:17:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:32.320 18:17:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:32.321 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:32.578 [2024-07-12 18:17:16.259746] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:32.578 [2024-07-12 18:17:16.259789] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:32.579 [2024-07-12 18:17:16.259800] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:32.579 [2024-07-12 18:17:16.259812] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:32.579 [2024-07-12 18:17:16.259821] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:32.579 [2024-07-12 18:17:16.259832] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:32.579 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:13:32.837 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:32.837 "name": "Existed_Raid", 00:13:32.837 "uuid": "71dbf6ab-802c-41e2-b99a-cbcd6d82bc03", 00:13:32.837 "strip_size_kb": 64, 00:13:32.837 "state": "configuring", 00:13:32.837 "raid_level": "raid0", 00:13:32.837 "superblock": true, 00:13:32.837 "num_base_bdevs": 3, 00:13:32.837 "num_base_bdevs_discovered": 0, 00:13:32.837 "num_base_bdevs_operational": 3, 00:13:32.837 "base_bdevs_list": [ 00:13:32.837 { 00:13:32.837 "name": "BaseBdev1", 00:13:32.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.837 "is_configured": false, 00:13:32.837 "data_offset": 0, 00:13:32.837 "data_size": 0 00:13:32.837 }, 00:13:32.837 { 00:13:32.837 "name": "BaseBdev2", 00:13:32.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.837 "is_configured": false, 00:13:32.837 "data_offset": 0, 00:13:32.837 "data_size": 0 00:13:32.837 }, 00:13:32.837 { 00:13:32.837 "name": "BaseBdev3", 00:13:32.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.837 "is_configured": false, 00:13:32.837 "data_offset": 0, 00:13:32.837 "data_size": 0 00:13:32.837 } 00:13:32.837 ] 00:13:32.837 }' 00:13:32.837 18:17:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:32.837 18:17:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:33.403 18:17:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:33.661 [2024-07-12 18:17:17.278303] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:33.661 [2024-07-12 18:17:17.278332] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1946a80 name Existed_Raid, state configuring 00:13:33.661 18:17:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:33.919 [2024-07-12 18:17:17.458810] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:33.919 [2024-07-12 18:17:17.458838] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:33.919 [2024-07-12 18:17:17.458848] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:33.920 [2024-07-12 18:17:17.458859] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:33.920 [2024-07-12 18:17:17.458868] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:33.920 [2024-07-12 18:17:17.458879] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:33.920 18:17:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:34.178 [2024-07-12 18:17:17.717396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:34.178 BaseBdev1 00:13:34.178 18:17:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:34.178 18:17:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:34.178 18:17:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:34.178 18:17:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:34.178 18:17:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:34.178 18:17:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- 
# bdev_timeout=2000 00:13:34.178 18:17:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:34.436 18:17:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:34.436 [ 00:13:34.436 { 00:13:34.436 "name": "BaseBdev1", 00:13:34.436 "aliases": [ 00:13:34.436 "a6c582dc-f4d9-4737-8f84-e93a7e0dec4a" 00:13:34.436 ], 00:13:34.436 "product_name": "Malloc disk", 00:13:34.436 "block_size": 512, 00:13:34.436 "num_blocks": 65536, 00:13:34.436 "uuid": "a6c582dc-f4d9-4737-8f84-e93a7e0dec4a", 00:13:34.436 "assigned_rate_limits": { 00:13:34.436 "rw_ios_per_sec": 0, 00:13:34.436 "rw_mbytes_per_sec": 0, 00:13:34.436 "r_mbytes_per_sec": 0, 00:13:34.436 "w_mbytes_per_sec": 0 00:13:34.436 }, 00:13:34.436 "claimed": true, 00:13:34.436 "claim_type": "exclusive_write", 00:13:34.436 "zoned": false, 00:13:34.436 "supported_io_types": { 00:13:34.436 "read": true, 00:13:34.436 "write": true, 00:13:34.436 "unmap": true, 00:13:34.436 "flush": true, 00:13:34.436 "reset": true, 00:13:34.436 "nvme_admin": false, 00:13:34.436 "nvme_io": false, 00:13:34.436 "nvme_io_md": false, 00:13:34.436 "write_zeroes": true, 00:13:34.436 "zcopy": true, 00:13:34.436 "get_zone_info": false, 00:13:34.436 "zone_management": false, 00:13:34.436 "zone_append": false, 00:13:34.437 "compare": false, 00:13:34.437 "compare_and_write": false, 00:13:34.437 "abort": true, 00:13:34.437 "seek_hole": false, 00:13:34.437 "seek_data": false, 00:13:34.437 "copy": true, 00:13:34.437 "nvme_iov_md": false 00:13:34.437 }, 00:13:34.437 "memory_domains": [ 00:13:34.437 { 00:13:34.437 "dma_device_id": "system", 00:13:34.437 "dma_device_type": 1 00:13:34.437 }, 00:13:34.437 { 00:13:34.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:34.437 
"dma_device_type": 2 00:13:34.437 } 00:13:34.437 ], 00:13:34.437 "driver_specific": {} 00:13:34.437 } 00:13:34.437 ] 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.437 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:34.695 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.695 "name": "Existed_Raid", 00:13:34.695 "uuid": "b0d9c674-01be-49aa-9ea9-78425cbe7779", 00:13:34.695 "strip_size_kb": 64, 
00:13:34.695 "state": "configuring", 00:13:34.695 "raid_level": "raid0", 00:13:34.695 "superblock": true, 00:13:34.695 "num_base_bdevs": 3, 00:13:34.695 "num_base_bdevs_discovered": 1, 00:13:34.695 "num_base_bdevs_operational": 3, 00:13:34.695 "base_bdevs_list": [ 00:13:34.695 { 00:13:34.695 "name": "BaseBdev1", 00:13:34.695 "uuid": "a6c582dc-f4d9-4737-8f84-e93a7e0dec4a", 00:13:34.695 "is_configured": true, 00:13:34.695 "data_offset": 2048, 00:13:34.695 "data_size": 63488 00:13:34.695 }, 00:13:34.695 { 00:13:34.695 "name": "BaseBdev2", 00:13:34.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.695 "is_configured": false, 00:13:34.695 "data_offset": 0, 00:13:34.695 "data_size": 0 00:13:34.695 }, 00:13:34.695 { 00:13:34.695 "name": "BaseBdev3", 00:13:34.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.695 "is_configured": false, 00:13:34.695 "data_offset": 0, 00:13:34.695 "data_size": 0 00:13:34.695 } 00:13:34.695 ] 00:13:34.695 }' 00:13:34.695 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.695 18:17:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:35.262 18:17:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:35.520 [2024-07-12 18:17:19.189297] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:35.520 [2024-07-12 18:17:19.189333] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1946310 name Existed_Raid, state configuring 00:13:35.520 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:35.781 [2024-07-12 18:17:19.425966] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:35.781 [2024-07-12 18:17:19.427395] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:35.781 [2024-07-12 18:17:19.427427] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:35.781 [2024-07-12 18:17:19.427437] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:35.781 [2024-07-12 18:17:19.427449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.781 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:36.040 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.040 "name": "Existed_Raid", 00:13:36.040 "uuid": "0b61a2ef-152f-4e7e-be38-7bc8fd9984ce", 00:13:36.040 "strip_size_kb": 64, 00:13:36.040 "state": "configuring", 00:13:36.040 "raid_level": "raid0", 00:13:36.040 "superblock": true, 00:13:36.040 "num_base_bdevs": 3, 00:13:36.040 "num_base_bdevs_discovered": 1, 00:13:36.040 "num_base_bdevs_operational": 3, 00:13:36.040 "base_bdevs_list": [ 00:13:36.040 { 00:13:36.040 "name": "BaseBdev1", 00:13:36.040 "uuid": "a6c582dc-f4d9-4737-8f84-e93a7e0dec4a", 00:13:36.040 "is_configured": true, 00:13:36.040 "data_offset": 2048, 00:13:36.040 "data_size": 63488 00:13:36.040 }, 00:13:36.040 { 00:13:36.040 "name": "BaseBdev2", 00:13:36.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.040 "is_configured": false, 00:13:36.040 "data_offset": 0, 00:13:36.040 "data_size": 0 00:13:36.040 }, 00:13:36.040 { 00:13:36.040 "name": "BaseBdev3", 00:13:36.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.040 "is_configured": false, 00:13:36.040 "data_offset": 0, 00:13:36.040 "data_size": 0 00:13:36.040 } 00:13:36.040 ] 00:13:36.040 }' 00:13:36.040 18:17:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.040 18:17:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:36.607 18:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:36.866 
[2024-07-12 18:17:20.524332] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:36.866 BaseBdev2 00:13:36.866 18:17:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:36.866 18:17:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:36.866 18:17:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:36.866 18:17:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:36.866 18:17:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:36.866 18:17:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:36.866 18:17:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:37.124 18:17:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:37.383 [ 00:13:37.383 { 00:13:37.383 "name": "BaseBdev2", 00:13:37.383 "aliases": [ 00:13:37.383 "cf731de0-5aad-4b47-89ef-26ef46ada4df" 00:13:37.383 ], 00:13:37.383 "product_name": "Malloc disk", 00:13:37.383 "block_size": 512, 00:13:37.383 "num_blocks": 65536, 00:13:37.383 "uuid": "cf731de0-5aad-4b47-89ef-26ef46ada4df", 00:13:37.383 "assigned_rate_limits": { 00:13:37.383 "rw_ios_per_sec": 0, 00:13:37.383 "rw_mbytes_per_sec": 0, 00:13:37.383 "r_mbytes_per_sec": 0, 00:13:37.383 "w_mbytes_per_sec": 0 00:13:37.383 }, 00:13:37.383 "claimed": true, 00:13:37.383 "claim_type": "exclusive_write", 00:13:37.383 "zoned": false, 00:13:37.383 "supported_io_types": { 00:13:37.383 "read": true, 00:13:37.383 "write": true, 00:13:37.383 "unmap": 
true, 00:13:37.383 "flush": true, 00:13:37.383 "reset": true, 00:13:37.383 "nvme_admin": false, 00:13:37.383 "nvme_io": false, 00:13:37.383 "nvme_io_md": false, 00:13:37.383 "write_zeroes": true, 00:13:37.383 "zcopy": true, 00:13:37.383 "get_zone_info": false, 00:13:37.383 "zone_management": false, 00:13:37.383 "zone_append": false, 00:13:37.383 "compare": false, 00:13:37.383 "compare_and_write": false, 00:13:37.383 "abort": true, 00:13:37.383 "seek_hole": false, 00:13:37.383 "seek_data": false, 00:13:37.383 "copy": true, 00:13:37.383 "nvme_iov_md": false 00:13:37.383 }, 00:13:37.383 "memory_domains": [ 00:13:37.383 { 00:13:37.383 "dma_device_id": "system", 00:13:37.383 "dma_device_type": 1 00:13:37.383 }, 00:13:37.383 { 00:13:37.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.383 "dma_device_type": 2 00:13:37.383 } 00:13:37.383 ], 00:13:37.383 "driver_specific": {} 00:13:37.383 } 00:13:37.383 ] 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:37.383 
18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.383 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:37.642 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.642 "name": "Existed_Raid", 00:13:37.642 "uuid": "0b61a2ef-152f-4e7e-be38-7bc8fd9984ce", 00:13:37.642 "strip_size_kb": 64, 00:13:37.642 "state": "configuring", 00:13:37.642 "raid_level": "raid0", 00:13:37.642 "superblock": true, 00:13:37.642 "num_base_bdevs": 3, 00:13:37.642 "num_base_bdevs_discovered": 2, 00:13:37.642 "num_base_bdevs_operational": 3, 00:13:37.642 "base_bdevs_list": [ 00:13:37.642 { 00:13:37.642 "name": "BaseBdev1", 00:13:37.642 "uuid": "a6c582dc-f4d9-4737-8f84-e93a7e0dec4a", 00:13:37.642 "is_configured": true, 00:13:37.642 "data_offset": 2048, 00:13:37.642 "data_size": 63488 00:13:37.642 }, 00:13:37.642 { 00:13:37.642 "name": "BaseBdev2", 00:13:37.642 "uuid": "cf731de0-5aad-4b47-89ef-26ef46ada4df", 00:13:37.642 "is_configured": true, 00:13:37.642 "data_offset": 2048, 00:13:37.642 "data_size": 63488 00:13:37.642 }, 00:13:37.642 { 00:13:37.642 "name": "BaseBdev3", 00:13:37.642 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.642 "is_configured": false, 00:13:37.642 "data_offset": 0, 00:13:37.642 "data_size": 0 00:13:37.642 } 00:13:37.642 ] 00:13:37.642 }' 00:13:37.642 
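The `verify_raid_bdev_state` step above fetches every raid bdev with `rpc.py bdev_raid_get_bdevs all` and uses `jq -r '.[] | select(.name == "Existed_Raid")'` to pick out the one under test, then checks fields such as `state` and `num_base_bdevs_discovered`. A minimal Python sketch of that selection, using values copied from the dump above (the real script talks to the `/var/tmp/spdk-raid.sock` RPC socket; the inline JSON here is a stand-in):

```python
import json

# Stand-in for the output of `rpc.py ... bdev_raid_get_bdevs all`;
# field values are taken from the raid_bdev_info dump in the log.
all_raids = json.loads("""[
  {"name": "Existed_Raid", "state": "configuring",
   "raid_level": "raid0", "strip_size_kb": 64,
   "num_base_bdevs": 3, "num_base_bdevs_discovered": 2,
   "num_base_bdevs_operational": 3}
]""")

# Equivalent of: jq -r '.[] | select(.name == "Existed_Raid")'
info = next(b for b in all_raids if b["name"] == "Existed_Raid")

# The checks verify_raid_bdev_state performs on the selected object.
assert info["state"] == "configuring"
assert info["raid_level"] == "raid0"
assert info["num_base_bdevs_discovered"] == 2
```

After BaseBdev2 is created and claimed, `num_base_bdevs_discovered` moves from 1 to 2 while the array stays in `configuring` until all three base bdevs exist.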
18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.642 18:17:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:38.208 18:17:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:38.467 [2024-07-12 18:17:22.035754] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:38.467 [2024-07-12 18:17:22.035915] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1947400 00:13:38.467 [2024-07-12 18:17:22.035940] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:38.467 [2024-07-12 18:17:22.036118] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1946ef0 00:13:38.467 [2024-07-12 18:17:22.036232] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1947400 00:13:38.467 [2024-07-12 18:17:22.036242] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1947400 00:13:38.467 [2024-07-12 18:17:22.036332] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:38.467 BaseBdev3 00:13:38.467 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:38.467 18:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:38.467 18:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:38.467 18:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:38.467 18:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:38.467 18:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:13:38.467 18:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:38.725 18:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:38.983 [ 00:13:38.983 { 00:13:38.983 "name": "BaseBdev3", 00:13:38.983 "aliases": [ 00:13:38.983 "1fd0be5b-e6a2-45f3-8cc2-8c6c4063904f" 00:13:38.983 ], 00:13:38.983 "product_name": "Malloc disk", 00:13:38.983 "block_size": 512, 00:13:38.983 "num_blocks": 65536, 00:13:38.983 "uuid": "1fd0be5b-e6a2-45f3-8cc2-8c6c4063904f", 00:13:38.984 "assigned_rate_limits": { 00:13:38.984 "rw_ios_per_sec": 0, 00:13:38.984 "rw_mbytes_per_sec": 0, 00:13:38.984 "r_mbytes_per_sec": 0, 00:13:38.984 "w_mbytes_per_sec": 0 00:13:38.984 }, 00:13:38.984 "claimed": true, 00:13:38.984 "claim_type": "exclusive_write", 00:13:38.984 "zoned": false, 00:13:38.984 "supported_io_types": { 00:13:38.984 "read": true, 00:13:38.984 "write": true, 00:13:38.984 "unmap": true, 00:13:38.984 "flush": true, 00:13:38.984 "reset": true, 00:13:38.984 "nvme_admin": false, 00:13:38.984 "nvme_io": false, 00:13:38.984 "nvme_io_md": false, 00:13:38.984 "write_zeroes": true, 00:13:38.984 "zcopy": true, 00:13:38.984 "get_zone_info": false, 00:13:38.984 "zone_management": false, 00:13:38.984 "zone_append": false, 00:13:38.984 "compare": false, 00:13:38.984 "compare_and_write": false, 00:13:38.984 "abort": true, 00:13:38.984 "seek_hole": false, 00:13:38.984 "seek_data": false, 00:13:38.984 "copy": true, 00:13:38.984 "nvme_iov_md": false 00:13:38.984 }, 00:13:38.984 "memory_domains": [ 00:13:38.984 { 00:13:38.984 "dma_device_id": "system", 00:13:38.984 "dma_device_type": 1 00:13:38.984 }, 00:13:38.984 { 00:13:38.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.984 
"dma_device_type": 2 00:13:38.984 } 00:13:38.984 ], 00:13:38.984 "driver_specific": {} 00:13:38.984 } 00:13:38.984 ] 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.984 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:39.248 18:17:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:39.248 "name": "Existed_Raid", 00:13:39.248 "uuid": "0b61a2ef-152f-4e7e-be38-7bc8fd9984ce", 00:13:39.248 "strip_size_kb": 64, 00:13:39.249 "state": "online", 00:13:39.249 "raid_level": "raid0", 00:13:39.249 "superblock": true, 00:13:39.249 "num_base_bdevs": 3, 00:13:39.249 "num_base_bdevs_discovered": 3, 00:13:39.249 "num_base_bdevs_operational": 3, 00:13:39.249 "base_bdevs_list": [ 00:13:39.249 { 00:13:39.249 "name": "BaseBdev1", 00:13:39.249 "uuid": "a6c582dc-f4d9-4737-8f84-e93a7e0dec4a", 00:13:39.249 "is_configured": true, 00:13:39.249 "data_offset": 2048, 00:13:39.249 "data_size": 63488 00:13:39.249 }, 00:13:39.249 { 00:13:39.249 "name": "BaseBdev2", 00:13:39.249 "uuid": "cf731de0-5aad-4b47-89ef-26ef46ada4df", 00:13:39.249 "is_configured": true, 00:13:39.249 "data_offset": 2048, 00:13:39.249 "data_size": 63488 00:13:39.249 }, 00:13:39.249 { 00:13:39.249 "name": "BaseBdev3", 00:13:39.249 "uuid": "1fd0be5b-e6a2-45f3-8cc2-8c6c4063904f", 00:13:39.249 "is_configured": true, 00:13:39.249 "data_offset": 2048, 00:13:39.249 "data_size": 63488 00:13:39.249 } 00:13:39.249 ] 00:13:39.249 }' 00:13:39.249 18:17:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:39.249 18:17:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:39.819 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:39.819 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:39.819 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:39.819 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:39.819 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:13:39.819 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:39.819 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:39.819 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:40.077 [2024-07-12 18:17:23.620266] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:40.078 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:40.078 "name": "Existed_Raid", 00:13:40.078 "aliases": [ 00:13:40.078 "0b61a2ef-152f-4e7e-be38-7bc8fd9984ce" 00:13:40.078 ], 00:13:40.078 "product_name": "Raid Volume", 00:13:40.078 "block_size": 512, 00:13:40.078 "num_blocks": 190464, 00:13:40.078 "uuid": "0b61a2ef-152f-4e7e-be38-7bc8fd9984ce", 00:13:40.078 "assigned_rate_limits": { 00:13:40.078 "rw_ios_per_sec": 0, 00:13:40.078 "rw_mbytes_per_sec": 0, 00:13:40.078 "r_mbytes_per_sec": 0, 00:13:40.078 "w_mbytes_per_sec": 0 00:13:40.078 }, 00:13:40.078 "claimed": false, 00:13:40.078 "zoned": false, 00:13:40.078 "supported_io_types": { 00:13:40.078 "read": true, 00:13:40.078 "write": true, 00:13:40.078 "unmap": true, 00:13:40.078 "flush": true, 00:13:40.078 "reset": true, 00:13:40.078 "nvme_admin": false, 00:13:40.078 "nvme_io": false, 00:13:40.078 "nvme_io_md": false, 00:13:40.078 "write_zeroes": true, 00:13:40.078 "zcopy": false, 00:13:40.078 "get_zone_info": false, 00:13:40.078 "zone_management": false, 00:13:40.078 "zone_append": false, 00:13:40.078 "compare": false, 00:13:40.078 "compare_and_write": false, 00:13:40.078 "abort": false, 00:13:40.078 "seek_hole": false, 00:13:40.078 "seek_data": false, 00:13:40.078 "copy": false, 00:13:40.078 "nvme_iov_md": false 00:13:40.078 }, 00:13:40.078 "memory_domains": [ 00:13:40.078 { 00:13:40.078 "dma_device_id": "system", 00:13:40.078 
"dma_device_type": 1 00:13:40.078 }, 00:13:40.078 { 00:13:40.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.078 "dma_device_type": 2 00:13:40.078 }, 00:13:40.078 { 00:13:40.078 "dma_device_id": "system", 00:13:40.078 "dma_device_type": 1 00:13:40.078 }, 00:13:40.078 { 00:13:40.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.078 "dma_device_type": 2 00:13:40.078 }, 00:13:40.078 { 00:13:40.078 "dma_device_id": "system", 00:13:40.078 "dma_device_type": 1 00:13:40.078 }, 00:13:40.078 { 00:13:40.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.078 "dma_device_type": 2 00:13:40.078 } 00:13:40.078 ], 00:13:40.078 "driver_specific": { 00:13:40.078 "raid": { 00:13:40.078 "uuid": "0b61a2ef-152f-4e7e-be38-7bc8fd9984ce", 00:13:40.078 "strip_size_kb": 64, 00:13:40.078 "state": "online", 00:13:40.078 "raid_level": "raid0", 00:13:40.078 "superblock": true, 00:13:40.078 "num_base_bdevs": 3, 00:13:40.078 "num_base_bdevs_discovered": 3, 00:13:40.078 "num_base_bdevs_operational": 3, 00:13:40.078 "base_bdevs_list": [ 00:13:40.078 { 00:13:40.078 "name": "BaseBdev1", 00:13:40.078 "uuid": "a6c582dc-f4d9-4737-8f84-e93a7e0dec4a", 00:13:40.078 "is_configured": true, 00:13:40.078 "data_offset": 2048, 00:13:40.078 "data_size": 63488 00:13:40.078 }, 00:13:40.078 { 00:13:40.078 "name": "BaseBdev2", 00:13:40.078 "uuid": "cf731de0-5aad-4b47-89ef-26ef46ada4df", 00:13:40.078 "is_configured": true, 00:13:40.078 "data_offset": 2048, 00:13:40.078 "data_size": 63488 00:13:40.078 }, 00:13:40.078 { 00:13:40.078 "name": "BaseBdev3", 00:13:40.078 "uuid": "1fd0be5b-e6a2-45f3-8cc2-8c6c4063904f", 00:13:40.078 "is_configured": true, 00:13:40.078 "data_offset": 2048, 00:13:40.078 "data_size": 63488 00:13:40.078 } 00:13:40.078 ] 00:13:40.078 } 00:13:40.078 } 00:13:40.078 }' 00:13:40.078 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:40.078 18:17:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:40.078 BaseBdev2 00:13:40.078 BaseBdev3' 00:13:40.078 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:40.078 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:40.078 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:40.337 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:40.337 "name": "BaseBdev1", 00:13:40.337 "aliases": [ 00:13:40.337 "a6c582dc-f4d9-4737-8f84-e93a7e0dec4a" 00:13:40.337 ], 00:13:40.337 "product_name": "Malloc disk", 00:13:40.337 "block_size": 512, 00:13:40.337 "num_blocks": 65536, 00:13:40.337 "uuid": "a6c582dc-f4d9-4737-8f84-e93a7e0dec4a", 00:13:40.337 "assigned_rate_limits": { 00:13:40.337 "rw_ios_per_sec": 0, 00:13:40.337 "rw_mbytes_per_sec": 0, 00:13:40.337 "r_mbytes_per_sec": 0, 00:13:40.337 "w_mbytes_per_sec": 0 00:13:40.337 }, 00:13:40.337 "claimed": true, 00:13:40.337 "claim_type": "exclusive_write", 00:13:40.337 "zoned": false, 00:13:40.337 "supported_io_types": { 00:13:40.337 "read": true, 00:13:40.337 "write": true, 00:13:40.337 "unmap": true, 00:13:40.337 "flush": true, 00:13:40.337 "reset": true, 00:13:40.337 "nvme_admin": false, 00:13:40.337 "nvme_io": false, 00:13:40.337 "nvme_io_md": false, 00:13:40.337 "write_zeroes": true, 00:13:40.337 "zcopy": true, 00:13:40.337 "get_zone_info": false, 00:13:40.337 "zone_management": false, 00:13:40.337 "zone_append": false, 00:13:40.337 "compare": false, 00:13:40.337 "compare_and_write": false, 00:13:40.337 "abort": true, 00:13:40.337 "seek_hole": false, 00:13:40.337 "seek_data": false, 00:13:40.337 "copy": true, 00:13:40.337 "nvme_iov_md": false 00:13:40.337 }, 00:13:40.337 "memory_domains": 
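`verify_raid_bdev_properties` (bdev_raid.sh@201) derives `base_bdev_names` by filtering the raid volume's `driver_specific.raid.base_bdevs_list` for configured entries with `jq -r '... | select(.is_configured == true).name'`. A sketch of that filter in Python, with the list reduced to the fields the filter actually reads:

```python
import json

# Trimmed stand-in for the Raid Volume object dumped by bdev_get_bdevs above.
raid_bdev_info = json.loads("""{
  "driver_specific": {"raid": {"base_bdevs_list": [
    {"name": "BaseBdev1", "is_configured": true},
    {"name": "BaseBdev2", "is_configured": true},
    {"name": "BaseBdev3", "is_configured": true}
  ]}}}""")

# Equivalent of:
# jq -r '.driver_specific.raid.base_bdevs_list[]
#        | select(.is_configured == true).name'
names = [b["name"]
         for b in raid_bdev_info["driver_specific"]["raid"]["base_bdevs_list"]
         if b["is_configured"]]
assert names == ["BaseBdev1", "BaseBdev2", "BaseBdev3"]
```

The script then loops over those names, fetching each base bdev and asserting its `block_size`, `md_size`, `md_interleave`, and `dif_type` match expectations, which is the repeating `jq .block_size` / `[[ 512 == 512 ]]` pattern visible in the trace.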
[ 00:13:40.337 { 00:13:40.337 "dma_device_id": "system", 00:13:40.337 "dma_device_type": 1 00:13:40.337 }, 00:13:40.337 { 00:13:40.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.337 "dma_device_type": 2 00:13:40.337 } 00:13:40.337 ], 00:13:40.337 "driver_specific": {} 00:13:40.337 }' 00:13:40.337 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:40.337 18:17:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:40.337 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:40.337 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:40.595 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:13:40.854 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:40.854 "name": "BaseBdev2", 00:13:40.854 "aliases": [ 00:13:40.854 "cf731de0-5aad-4b47-89ef-26ef46ada4df" 00:13:40.854 ], 00:13:40.854 "product_name": "Malloc disk", 00:13:40.854 "block_size": 512, 00:13:40.854 "num_blocks": 65536, 00:13:40.854 "uuid": "cf731de0-5aad-4b47-89ef-26ef46ada4df", 00:13:40.854 "assigned_rate_limits": { 00:13:40.854 "rw_ios_per_sec": 0, 00:13:40.854 "rw_mbytes_per_sec": 0, 00:13:40.854 "r_mbytes_per_sec": 0, 00:13:40.854 "w_mbytes_per_sec": 0 00:13:40.854 }, 00:13:40.854 "claimed": true, 00:13:40.854 "claim_type": "exclusive_write", 00:13:40.854 "zoned": false, 00:13:40.854 "supported_io_types": { 00:13:40.854 "read": true, 00:13:40.854 "write": true, 00:13:40.854 "unmap": true, 00:13:40.854 "flush": true, 00:13:40.854 "reset": true, 00:13:40.854 "nvme_admin": false, 00:13:40.854 "nvme_io": false, 00:13:40.854 "nvme_io_md": false, 00:13:40.854 "write_zeroes": true, 00:13:40.854 "zcopy": true, 00:13:40.854 "get_zone_info": false, 00:13:40.854 "zone_management": false, 00:13:40.854 "zone_append": false, 00:13:40.854 "compare": false, 00:13:40.854 "compare_and_write": false, 00:13:40.854 "abort": true, 00:13:40.854 "seek_hole": false, 00:13:40.854 "seek_data": false, 00:13:40.854 "copy": true, 00:13:40.854 "nvme_iov_md": false 00:13:40.854 }, 00:13:40.854 "memory_domains": [ 00:13:40.854 { 00:13:40.854 "dma_device_id": "system", 00:13:40.854 "dma_device_type": 1 00:13:40.854 }, 00:13:40.854 { 00:13:40.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.854 "dma_device_type": 2 00:13:40.854 } 00:13:40.854 ], 00:13:40.854 "driver_specific": {} 00:13:40.854 }' 00:13:40.854 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:40.854 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.113 18:17:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:41.113 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.113 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.113 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:41.113 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.113 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.113 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:41.113 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.113 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.371 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:41.371 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:41.371 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:41.371 18:17:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:41.629 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:41.629 "name": "BaseBdev3", 00:13:41.629 "aliases": [ 00:13:41.629 "1fd0be5b-e6a2-45f3-8cc2-8c6c4063904f" 00:13:41.629 ], 00:13:41.629 "product_name": "Malloc disk", 00:13:41.629 "block_size": 512, 00:13:41.629 "num_blocks": 65536, 00:13:41.629 "uuid": "1fd0be5b-e6a2-45f3-8cc2-8c6c4063904f", 00:13:41.629 "assigned_rate_limits": { 00:13:41.629 "rw_ios_per_sec": 0, 00:13:41.629 "rw_mbytes_per_sec": 0, 00:13:41.629 "r_mbytes_per_sec": 0, 00:13:41.629 
"w_mbytes_per_sec": 0 00:13:41.629 }, 00:13:41.629 "claimed": true, 00:13:41.629 "claim_type": "exclusive_write", 00:13:41.629 "zoned": false, 00:13:41.629 "supported_io_types": { 00:13:41.629 "read": true, 00:13:41.629 "write": true, 00:13:41.629 "unmap": true, 00:13:41.629 "flush": true, 00:13:41.629 "reset": true, 00:13:41.629 "nvme_admin": false, 00:13:41.629 "nvme_io": false, 00:13:41.630 "nvme_io_md": false, 00:13:41.630 "write_zeroes": true, 00:13:41.630 "zcopy": true, 00:13:41.630 "get_zone_info": false, 00:13:41.630 "zone_management": false, 00:13:41.630 "zone_append": false, 00:13:41.630 "compare": false, 00:13:41.630 "compare_and_write": false, 00:13:41.630 "abort": true, 00:13:41.630 "seek_hole": false, 00:13:41.630 "seek_data": false, 00:13:41.630 "copy": true, 00:13:41.630 "nvme_iov_md": false 00:13:41.630 }, 00:13:41.630 "memory_domains": [ 00:13:41.630 { 00:13:41.630 "dma_device_id": "system", 00:13:41.630 "dma_device_type": 1 00:13:41.630 }, 00:13:41.630 { 00:13:41.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.630 "dma_device_type": 2 00:13:41.630 } 00:13:41.630 ], 00:13:41.630 "driver_specific": {} 00:13:41.630 }' 00:13:41.630 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.630 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:41.630 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:41.630 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.630 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:41.630 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:41.630 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:41.630 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:13:41.888 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:41.888 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.888 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:41.888 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:41.888 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:42.147 [2024-07-12 18:17:25.685522] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:42.147 [2024-07-12 18:17:25.685552] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:42.147 [2024-07-12 18:17:25.685592] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:42.147 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:42.147 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:42.147 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:42.147 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:42.147 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:42.147 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:42.148 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.407 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.407 "name": "Existed_Raid", 00:13:42.407 "uuid": "0b61a2ef-152f-4e7e-be38-7bc8fd9984ce", 00:13:42.407 "strip_size_kb": 64, 00:13:42.407 "state": "offline", 00:13:42.407 "raid_level": "raid0", 00:13:42.407 "superblock": true, 00:13:42.407 "num_base_bdevs": 3, 00:13:42.407 "num_base_bdevs_discovered": 2, 00:13:42.407 "num_base_bdevs_operational": 2, 00:13:42.407 "base_bdevs_list": [ 00:13:42.407 { 00:13:42.407 "name": null, 00:13:42.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.407 "is_configured": false, 00:13:42.407 "data_offset": 2048, 00:13:42.407 "data_size": 63488 00:13:42.407 }, 00:13:42.407 { 00:13:42.407 "name": "BaseBdev2", 00:13:42.407 "uuid": "cf731de0-5aad-4b47-89ef-26ef46ada4df", 00:13:42.407 "is_configured": true, 00:13:42.407 "data_offset": 2048, 00:13:42.407 "data_size": 
63488 00:13:42.407 }, 00:13:42.407 { 00:13:42.407 "name": "BaseBdev3", 00:13:42.407 "uuid": "1fd0be5b-e6a2-45f3-8cc2-8c6c4063904f", 00:13:42.407 "is_configured": true, 00:13:42.407 "data_offset": 2048, 00:13:42.407 "data_size": 63488 00:13:42.407 } 00:13:42.407 ] 00:13:42.407 }' 00:13:42.407 18:17:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.407 18:17:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:42.974 18:17:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:42.974 18:17:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:42.974 18:17:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:42.974 18:17:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.232 18:17:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:43.232 18:17:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:43.233 18:17:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:43.491 [2024-07-12 18:17:27.018983] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:43.491 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:43.491 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:43.491 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
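The offline transition above follows from `has_redundancy` (bdev_raid.sh@213-215): for raid0 the case statement returns 1, so removing BaseBdev1 drives the array to `offline` rather than `degraded`. A minimal Python sketch of that decision; the exact set of levels the script treats as redundant is an assumption here, not confirmed by this log:

```python
def has_redundancy(raid_level: str) -> bool:
    # Sketch of bdev_raid.sh's has_redundancy case statement: levels that can
    # survive losing a base bdev "return 0"; raid0 falls through to "return 1".
    # Assumption: the redundant-level list may differ in the real script.
    return raid_level in ("raid1", "raid5f")

# Mirrors bdev_raid.sh@276-277: raid0 has no redundancy, so after deleting
# BaseBdev1 the expected array state is offline.
expected_state = "online" if has_redundancy("raid0") else "offline"
assert expected_state == "offline"
```

This matches the subsequent `verify_raid_bdev_state Existed_Raid offline raid0 64 2` call and the dump showing `"state": "offline"` with only two configured base bdevs remaining.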
00:13:43.491 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:43.783 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:43.783 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:43.783 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:44.041 [2024-07-12 18:17:27.524357] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:44.041 [2024-07-12 18:17:27.524401] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1947400 name Existed_Raid, state offline 00:13:44.041 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:44.041 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:44.041 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.041 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:44.299 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:44.299 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:44.299 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:44.299 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:44.299 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:44.299 18:17:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:44.299 BaseBdev2 00:13:44.557 18:17:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:44.557 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:44.557 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:44.557 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:44.557 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:44.557 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:44.557 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:44.558 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:44.816 [ 00:13:44.816 { 00:13:44.817 "name": "BaseBdev2", 00:13:44.817 "aliases": [ 00:13:44.817 "ee803b37-e86f-4a4a-9394-62ba2f5e85ad" 00:13:44.817 ], 00:13:44.817 "product_name": "Malloc disk", 00:13:44.817 "block_size": 512, 00:13:44.817 "num_blocks": 65536, 00:13:44.817 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:44.817 "assigned_rate_limits": { 00:13:44.817 "rw_ios_per_sec": 0, 00:13:44.817 "rw_mbytes_per_sec": 0, 00:13:44.817 "r_mbytes_per_sec": 0, 00:13:44.817 "w_mbytes_per_sec": 0 00:13:44.817 }, 00:13:44.817 "claimed": false, 00:13:44.817 "zoned": false, 00:13:44.817 "supported_io_types": { 00:13:44.817 "read": true, 00:13:44.817 "write": true, 00:13:44.817 "unmap": true, 00:13:44.817 "flush": 
true, 00:13:44.817 "reset": true, 00:13:44.817 "nvme_admin": false, 00:13:44.817 "nvme_io": false, 00:13:44.817 "nvme_io_md": false, 00:13:44.817 "write_zeroes": true, 00:13:44.817 "zcopy": true, 00:13:44.817 "get_zone_info": false, 00:13:44.817 "zone_management": false, 00:13:44.817 "zone_append": false, 00:13:44.817 "compare": false, 00:13:44.817 "compare_and_write": false, 00:13:44.817 "abort": true, 00:13:44.817 "seek_hole": false, 00:13:44.817 "seek_data": false, 00:13:44.817 "copy": true, 00:13:44.817 "nvme_iov_md": false 00:13:44.817 }, 00:13:44.817 "memory_domains": [ 00:13:44.817 { 00:13:44.817 "dma_device_id": "system", 00:13:44.817 "dma_device_type": 1 00:13:44.817 }, 00:13:44.817 { 00:13:44.817 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:44.817 "dma_device_type": 2 00:13:44.817 } 00:13:44.817 ], 00:13:44.817 "driver_specific": {} 00:13:44.817 } 00:13:44.817 ] 00:13:44.817 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:44.817 18:17:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:44.817 18:17:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:44.817 18:17:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:45.075 BaseBdev3 00:13:45.075 18:17:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:45.075 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:45.075 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:45.075 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:45.075 18:17:28 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:45.075 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:45.075 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:45.333 18:17:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:45.594 [ 00:13:45.594 { 00:13:45.594 "name": "BaseBdev3", 00:13:45.594 "aliases": [ 00:13:45.594 "9ae007a2-b1bd-4bcd-96f3-679a2225d96a" 00:13:45.594 ], 00:13:45.594 "product_name": "Malloc disk", 00:13:45.594 "block_size": 512, 00:13:45.594 "num_blocks": 65536, 00:13:45.594 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:45.594 "assigned_rate_limits": { 00:13:45.594 "rw_ios_per_sec": 0, 00:13:45.594 "rw_mbytes_per_sec": 0, 00:13:45.594 "r_mbytes_per_sec": 0, 00:13:45.594 "w_mbytes_per_sec": 0 00:13:45.594 }, 00:13:45.594 "claimed": false, 00:13:45.594 "zoned": false, 00:13:45.594 "supported_io_types": { 00:13:45.594 "read": true, 00:13:45.594 "write": true, 00:13:45.594 "unmap": true, 00:13:45.594 "flush": true, 00:13:45.594 "reset": true, 00:13:45.594 "nvme_admin": false, 00:13:45.594 "nvme_io": false, 00:13:45.594 "nvme_io_md": false, 00:13:45.594 "write_zeroes": true, 00:13:45.594 "zcopy": true, 00:13:45.594 "get_zone_info": false, 00:13:45.594 "zone_management": false, 00:13:45.594 "zone_append": false, 00:13:45.594 "compare": false, 00:13:45.594 "compare_and_write": false, 00:13:45.594 "abort": true, 00:13:45.594 "seek_hole": false, 00:13:45.594 "seek_data": false, 00:13:45.594 "copy": true, 00:13:45.594 "nvme_iov_md": false 00:13:45.594 }, 00:13:45.594 "memory_domains": [ 00:13:45.594 { 00:13:45.594 "dma_device_id": "system", 00:13:45.594 "dma_device_type": 1 
00:13:45.594 }, 00:13:45.594 { 00:13:45.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.594 "dma_device_type": 2 00:13:45.594 } 00:13:45.594 ], 00:13:45.594 "driver_specific": {} 00:13:45.594 } 00:13:45.594 ] 00:13:45.594 18:17:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:45.594 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:45.594 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:45.594 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:45.851 [2024-07-12 18:17:29.443867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:45.851 [2024-07-12 18:17:29.443909] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:45.851 [2024-07-12 18:17:29.443934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:45.851 [2024-07-12 18:17:29.445257] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.851 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.109 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.109 "name": "Existed_Raid", 00:13:46.109 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:46.109 "strip_size_kb": 64, 00:13:46.109 "state": "configuring", 00:13:46.109 "raid_level": "raid0", 00:13:46.109 "superblock": true, 00:13:46.109 "num_base_bdevs": 3, 00:13:46.109 "num_base_bdevs_discovered": 2, 00:13:46.109 "num_base_bdevs_operational": 3, 00:13:46.109 "base_bdevs_list": [ 00:13:46.109 { 00:13:46.109 "name": "BaseBdev1", 00:13:46.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.109 "is_configured": false, 00:13:46.109 "data_offset": 0, 00:13:46.109 "data_size": 0 00:13:46.109 }, 00:13:46.109 { 00:13:46.109 "name": "BaseBdev2", 00:13:46.109 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:46.109 "is_configured": true, 00:13:46.109 "data_offset": 2048, 00:13:46.109 "data_size": 63488 00:13:46.109 }, 00:13:46.109 { 00:13:46.109 "name": "BaseBdev3", 00:13:46.109 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:46.109 "is_configured": true, 00:13:46.109 "data_offset": 2048, 00:13:46.109 
"data_size": 63488 00:13:46.109 } 00:13:46.109 ] 00:13:46.109 }' 00:13:46.109 18:17:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.109 18:17:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:46.674 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:46.932 [2024-07-12 18:17:30.510677] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:46.932 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.189 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.189 "name": "Existed_Raid", 00:13:47.189 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:47.189 "strip_size_kb": 64, 00:13:47.189 "state": "configuring", 00:13:47.189 "raid_level": "raid0", 00:13:47.189 "superblock": true, 00:13:47.189 "num_base_bdevs": 3, 00:13:47.189 "num_base_bdevs_discovered": 1, 00:13:47.189 "num_base_bdevs_operational": 3, 00:13:47.189 "base_bdevs_list": [ 00:13:47.189 { 00:13:47.189 "name": "BaseBdev1", 00:13:47.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.189 "is_configured": false, 00:13:47.189 "data_offset": 0, 00:13:47.189 "data_size": 0 00:13:47.189 }, 00:13:47.189 { 00:13:47.189 "name": null, 00:13:47.189 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:47.189 "is_configured": false, 00:13:47.189 "data_offset": 2048, 00:13:47.189 "data_size": 63488 00:13:47.189 }, 00:13:47.189 { 00:13:47.189 "name": "BaseBdev3", 00:13:47.189 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:47.189 "is_configured": true, 00:13:47.189 "data_offset": 2048, 00:13:47.189 "data_size": 63488 00:13:47.189 } 00:13:47.189 ] 00:13:47.189 }' 00:13:47.189 18:17:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.189 18:17:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:47.753 18:17:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:47.753 18:17:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.010 18:17:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:13:48.010 18:17:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:48.268 [2024-07-12 18:17:31.830773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:48.268 BaseBdev1 00:13:48.268 18:17:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:48.268 18:17:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:48.268 18:17:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:48.268 18:17:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:48.268 18:17:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:48.268 18:17:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:48.268 18:17:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:48.525 18:17:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:48.784 [ 00:13:48.784 { 00:13:48.784 "name": "BaseBdev1", 00:13:48.784 "aliases": [ 00:13:48.784 "6b2cd226-5f3f-4891-b2be-b9a122ca060f" 00:13:48.784 ], 00:13:48.784 "product_name": "Malloc disk", 00:13:48.784 "block_size": 512, 00:13:48.784 "num_blocks": 65536, 00:13:48.784 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:48.784 "assigned_rate_limits": { 00:13:48.784 "rw_ios_per_sec": 0, 00:13:48.784 "rw_mbytes_per_sec": 0, 00:13:48.784 "r_mbytes_per_sec": 0, 00:13:48.784 
"w_mbytes_per_sec": 0 00:13:48.784 }, 00:13:48.784 "claimed": true, 00:13:48.784 "claim_type": "exclusive_write", 00:13:48.784 "zoned": false, 00:13:48.784 "supported_io_types": { 00:13:48.784 "read": true, 00:13:48.784 "write": true, 00:13:48.784 "unmap": true, 00:13:48.784 "flush": true, 00:13:48.784 "reset": true, 00:13:48.784 "nvme_admin": false, 00:13:48.784 "nvme_io": false, 00:13:48.784 "nvme_io_md": false, 00:13:48.784 "write_zeroes": true, 00:13:48.784 "zcopy": true, 00:13:48.784 "get_zone_info": false, 00:13:48.784 "zone_management": false, 00:13:48.784 "zone_append": false, 00:13:48.784 "compare": false, 00:13:48.784 "compare_and_write": false, 00:13:48.784 "abort": true, 00:13:48.784 "seek_hole": false, 00:13:48.784 "seek_data": false, 00:13:48.784 "copy": true, 00:13:48.784 "nvme_iov_md": false 00:13:48.784 }, 00:13:48.784 "memory_domains": [ 00:13:48.784 { 00:13:48.784 "dma_device_id": "system", 00:13:48.784 "dma_device_type": 1 00:13:48.784 }, 00:13:48.784 { 00:13:48.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.784 "dma_device_type": 2 00:13:48.784 } 00:13:48.784 ], 00:13:48.784 "driver_specific": {} 00:13:48.784 } 00:13:48.784 ] 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.784 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.042 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.042 "name": "Existed_Raid", 00:13:49.042 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:49.042 "strip_size_kb": 64, 00:13:49.042 "state": "configuring", 00:13:49.042 "raid_level": "raid0", 00:13:49.042 "superblock": true, 00:13:49.042 "num_base_bdevs": 3, 00:13:49.042 "num_base_bdevs_discovered": 2, 00:13:49.042 "num_base_bdevs_operational": 3, 00:13:49.042 "base_bdevs_list": [ 00:13:49.042 { 00:13:49.042 "name": "BaseBdev1", 00:13:49.042 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:49.042 "is_configured": true, 00:13:49.042 "data_offset": 2048, 00:13:49.042 "data_size": 63488 00:13:49.042 }, 00:13:49.042 { 00:13:49.042 "name": null, 00:13:49.042 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:49.042 "is_configured": false, 00:13:49.042 "data_offset": 2048, 00:13:49.042 "data_size": 63488 00:13:49.042 }, 00:13:49.042 { 00:13:49.042 "name": "BaseBdev3", 00:13:49.042 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:49.042 "is_configured": true, 00:13:49.042 "data_offset": 2048, 00:13:49.042 "data_size": 63488 00:13:49.042 } 
00:13:49.042 ] 00:13:49.042 }' 00:13:49.042 18:17:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.042 18:17:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:49.614 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:49.614 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.871 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:49.871 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:50.130 [2024-07-12 18:17:33.683690] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.130 
18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.130 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.388 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.388 "name": "Existed_Raid", 00:13:50.388 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:50.388 "strip_size_kb": 64, 00:13:50.388 "state": "configuring", 00:13:50.388 "raid_level": "raid0", 00:13:50.388 "superblock": true, 00:13:50.388 "num_base_bdevs": 3, 00:13:50.388 "num_base_bdevs_discovered": 1, 00:13:50.388 "num_base_bdevs_operational": 3, 00:13:50.388 "base_bdevs_list": [ 00:13:50.388 { 00:13:50.388 "name": "BaseBdev1", 00:13:50.388 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:50.388 "is_configured": true, 00:13:50.388 "data_offset": 2048, 00:13:50.388 "data_size": 63488 00:13:50.388 }, 00:13:50.388 { 00:13:50.388 "name": null, 00:13:50.388 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:50.388 "is_configured": false, 00:13:50.388 "data_offset": 2048, 00:13:50.388 "data_size": 63488 00:13:50.388 }, 00:13:50.388 { 00:13:50.388 "name": null, 00:13:50.388 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:50.388 "is_configured": false, 00:13:50.388 "data_offset": 2048, 00:13:50.388 "data_size": 63488 00:13:50.388 } 00:13:50.388 ] 00:13:50.388 }' 00:13:50.388 18:17:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.388 18:17:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:50.953 18:17:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.953 18:17:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:51.211 18:17:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:51.211 18:17:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:51.468 [2024-07-12 18:17:34.999196] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:51.468 18:17:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.468 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:51.726 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:51.726 "name": "Existed_Raid", 00:13:51.726 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:51.726 "strip_size_kb": 64, 00:13:51.726 "state": "configuring", 00:13:51.726 "raid_level": "raid0", 00:13:51.726 "superblock": true, 00:13:51.726 "num_base_bdevs": 3, 00:13:51.726 "num_base_bdevs_discovered": 2, 00:13:51.726 "num_base_bdevs_operational": 3, 00:13:51.726 "base_bdevs_list": [ 00:13:51.726 { 00:13:51.726 "name": "BaseBdev1", 00:13:51.726 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:51.726 "is_configured": true, 00:13:51.726 "data_offset": 2048, 00:13:51.726 "data_size": 63488 00:13:51.726 }, 00:13:51.726 { 00:13:51.726 "name": null, 00:13:51.726 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:51.726 "is_configured": false, 00:13:51.726 "data_offset": 2048, 00:13:51.726 "data_size": 63488 00:13:51.726 }, 00:13:51.726 { 00:13:51.726 "name": "BaseBdev3", 00:13:51.726 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:51.726 "is_configured": true, 00:13:51.726 "data_offset": 2048, 00:13:51.726 "data_size": 63488 00:13:51.726 } 00:13:51.726 ] 00:13:51.726 }' 00:13:51.726 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:51.726 18:17:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:52.293 18:17:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.293 18:17:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:52.550 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:52.550 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:52.808 [2024-07-12 18:17:36.334759] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.808 18:17:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.067 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.067 "name": "Existed_Raid", 00:13:53.067 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:53.067 "strip_size_kb": 64, 00:13:53.067 "state": "configuring", 00:13:53.067 "raid_level": "raid0", 00:13:53.067 "superblock": true, 00:13:53.067 "num_base_bdevs": 3, 00:13:53.067 "num_base_bdevs_discovered": 1, 00:13:53.067 "num_base_bdevs_operational": 3, 00:13:53.067 "base_bdevs_list": [ 00:13:53.067 { 00:13:53.067 "name": null, 00:13:53.067 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:53.067 "is_configured": false, 00:13:53.067 "data_offset": 2048, 00:13:53.067 "data_size": 63488 00:13:53.067 }, 00:13:53.067 { 00:13:53.067 "name": null, 00:13:53.067 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:53.067 "is_configured": false, 00:13:53.067 "data_offset": 2048, 00:13:53.067 "data_size": 63488 00:13:53.067 }, 00:13:53.067 { 00:13:53.067 "name": "BaseBdev3", 00:13:53.067 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:53.067 "is_configured": true, 00:13:53.067 "data_offset": 2048, 00:13:53.067 "data_size": 63488 00:13:53.067 } 00:13:53.067 ] 00:13:53.067 }' 00:13:53.067 18:17:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.067 18:17:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:53.633 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.633 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:53.892 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:53.892 18:17:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:54.150 [2024-07-12 18:17:37.680472] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.150 18:17:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:54.716 18:17:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:54.716 "name": 
"Existed_Raid", 00:13:54.716 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:54.716 "strip_size_kb": 64, 00:13:54.716 "state": "configuring", 00:13:54.716 "raid_level": "raid0", 00:13:54.716 "superblock": true, 00:13:54.716 "num_base_bdevs": 3, 00:13:54.716 "num_base_bdevs_discovered": 2, 00:13:54.716 "num_base_bdevs_operational": 3, 00:13:54.716 "base_bdevs_list": [ 00:13:54.716 { 00:13:54.716 "name": null, 00:13:54.716 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:54.716 "is_configured": false, 00:13:54.716 "data_offset": 2048, 00:13:54.716 "data_size": 63488 00:13:54.716 }, 00:13:54.716 { 00:13:54.716 "name": "BaseBdev2", 00:13:54.716 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:54.716 "is_configured": true, 00:13:54.716 "data_offset": 2048, 00:13:54.716 "data_size": 63488 00:13:54.716 }, 00:13:54.716 { 00:13:54.716 "name": "BaseBdev3", 00:13:54.716 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:54.716 "is_configured": true, 00:13:54.716 "data_offset": 2048, 00:13:54.716 "data_size": 63488 00:13:54.716 } 00:13:54.716 ] 00:13:54.716 }' 00:13:54.716 18:17:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:54.716 18:17:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:55.282 18:17:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.282 18:17:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:55.282 18:17:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:55.282 18:17:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.282 18:17:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:55.847 18:17:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6b2cd226-5f3f-4891-b2be-b9a122ca060f 00:13:56.105 [2024-07-12 18:17:39.733429] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:56.105 [2024-07-12 18:17:39.733581] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1945e90 00:13:56.105 [2024-07-12 18:17:39.733594] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:56.105 [2024-07-12 18:17:39.733769] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x164c940 00:13:56.105 [2024-07-12 18:17:39.733878] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1945e90 00:13:56.105 [2024-07-12 18:17:39.733888] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1945e90 00:13:56.105 [2024-07-12 18:17:39.733986] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:56.105 NewBaseBdev 00:13:56.105 18:17:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:56.105 18:17:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:13:56.105 18:17:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:56.105 18:17:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:56.105 18:17:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:56.105 18:17:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:56.105 18:17:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:56.363 18:17:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:56.621 [ 00:13:56.621 { 00:13:56.621 "name": "NewBaseBdev", 00:13:56.621 "aliases": [ 00:13:56.621 "6b2cd226-5f3f-4891-b2be-b9a122ca060f" 00:13:56.621 ], 00:13:56.621 "product_name": "Malloc disk", 00:13:56.621 "block_size": 512, 00:13:56.621 "num_blocks": 65536, 00:13:56.622 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:56.622 "assigned_rate_limits": { 00:13:56.622 "rw_ios_per_sec": 0, 00:13:56.622 "rw_mbytes_per_sec": 0, 00:13:56.622 "r_mbytes_per_sec": 0, 00:13:56.622 "w_mbytes_per_sec": 0 00:13:56.622 }, 00:13:56.622 "claimed": true, 00:13:56.622 "claim_type": "exclusive_write", 00:13:56.622 "zoned": false, 00:13:56.622 "supported_io_types": { 00:13:56.622 "read": true, 00:13:56.622 "write": true, 00:13:56.622 "unmap": true, 00:13:56.622 "flush": true, 00:13:56.622 "reset": true, 00:13:56.622 "nvme_admin": false, 00:13:56.622 "nvme_io": false, 00:13:56.622 "nvme_io_md": false, 00:13:56.622 "write_zeroes": true, 00:13:56.622 "zcopy": true, 00:13:56.622 "get_zone_info": false, 00:13:56.622 "zone_management": false, 00:13:56.622 "zone_append": false, 00:13:56.622 "compare": false, 00:13:56.622 "compare_and_write": false, 00:13:56.622 "abort": true, 00:13:56.622 "seek_hole": false, 00:13:56.622 "seek_data": false, 00:13:56.622 "copy": true, 00:13:56.622 "nvme_iov_md": false 00:13:56.622 }, 00:13:56.622 "memory_domains": [ 00:13:56.622 { 00:13:56.622 "dma_device_id": "system", 00:13:56.622 "dma_device_type": 1 00:13:56.622 }, 00:13:56.622 { 00:13:56.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.622 "dma_device_type": 2 00:13:56.622 } 
00:13:56.622 ], 00:13:56.622 "driver_specific": {} 00:13:56.622 } 00:13:56.622 ] 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.622 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.880 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.880 "name": "Existed_Raid", 00:13:56.880 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:56.880 "strip_size_kb": 64, 00:13:56.880 "state": "online", 00:13:56.880 
"raid_level": "raid0", 00:13:56.880 "superblock": true, 00:13:56.880 "num_base_bdevs": 3, 00:13:56.880 "num_base_bdevs_discovered": 3, 00:13:56.880 "num_base_bdevs_operational": 3, 00:13:56.880 "base_bdevs_list": [ 00:13:56.880 { 00:13:56.880 "name": "NewBaseBdev", 00:13:56.880 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:56.880 "is_configured": true, 00:13:56.880 "data_offset": 2048, 00:13:56.880 "data_size": 63488 00:13:56.880 }, 00:13:56.880 { 00:13:56.880 "name": "BaseBdev2", 00:13:56.880 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:56.880 "is_configured": true, 00:13:56.880 "data_offset": 2048, 00:13:56.880 "data_size": 63488 00:13:56.880 }, 00:13:56.880 { 00:13:56.880 "name": "BaseBdev3", 00:13:56.880 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:56.880 "is_configured": true, 00:13:56.880 "data_offset": 2048, 00:13:56.880 "data_size": 63488 00:13:56.880 } 00:13:56.880 ] 00:13:56.880 }' 00:13:56.880 18:17:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.880 18:17:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.814 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:57.814 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:57.814 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:57.814 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:57.814 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:57.814 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:57.814 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:57.814 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:57.814 [2024-07-12 18:17:41.486405] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:57.814 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:57.814 "name": "Existed_Raid", 00:13:57.814 "aliases": [ 00:13:57.814 "3c072903-ee60-470b-adc9-49683f48cc7b" 00:13:57.814 ], 00:13:57.814 "product_name": "Raid Volume", 00:13:57.814 "block_size": 512, 00:13:57.814 "num_blocks": 190464, 00:13:57.814 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:57.814 "assigned_rate_limits": { 00:13:57.814 "rw_ios_per_sec": 0, 00:13:57.814 "rw_mbytes_per_sec": 0, 00:13:57.814 "r_mbytes_per_sec": 0, 00:13:57.814 "w_mbytes_per_sec": 0 00:13:57.814 }, 00:13:57.814 "claimed": false, 00:13:57.814 "zoned": false, 00:13:57.814 "supported_io_types": { 00:13:57.814 "read": true, 00:13:57.814 "write": true, 00:13:57.814 "unmap": true, 00:13:57.814 "flush": true, 00:13:57.814 "reset": true, 00:13:57.814 "nvme_admin": false, 00:13:57.814 "nvme_io": false, 00:13:57.814 "nvme_io_md": false, 00:13:57.814 "write_zeroes": true, 00:13:57.814 "zcopy": false, 00:13:57.814 "get_zone_info": false, 00:13:57.814 "zone_management": false, 00:13:57.814 "zone_append": false, 00:13:57.814 "compare": false, 00:13:57.814 "compare_and_write": false, 00:13:57.814 "abort": false, 00:13:57.814 "seek_hole": false, 00:13:57.814 "seek_data": false, 00:13:57.814 "copy": false, 00:13:57.814 "nvme_iov_md": false 00:13:57.814 }, 00:13:57.814 "memory_domains": [ 00:13:57.814 { 00:13:57.814 "dma_device_id": "system", 00:13:57.814 "dma_device_type": 1 00:13:57.814 }, 00:13:57.814 { 00:13:57.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.814 "dma_device_type": 2 00:13:57.814 }, 00:13:57.814 { 00:13:57.814 "dma_device_id": "system", 00:13:57.814 "dma_device_type": 1 00:13:57.814 
}, 00:13:57.814 { 00:13:57.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.814 "dma_device_type": 2 00:13:57.814 }, 00:13:57.814 { 00:13:57.814 "dma_device_id": "system", 00:13:57.815 "dma_device_type": 1 00:13:57.815 }, 00:13:57.815 { 00:13:57.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.815 "dma_device_type": 2 00:13:57.815 } 00:13:57.815 ], 00:13:57.815 "driver_specific": { 00:13:57.815 "raid": { 00:13:57.815 "uuid": "3c072903-ee60-470b-adc9-49683f48cc7b", 00:13:57.815 "strip_size_kb": 64, 00:13:57.815 "state": "online", 00:13:57.815 "raid_level": "raid0", 00:13:57.815 "superblock": true, 00:13:57.815 "num_base_bdevs": 3, 00:13:57.815 "num_base_bdevs_discovered": 3, 00:13:57.815 "num_base_bdevs_operational": 3, 00:13:57.815 "base_bdevs_list": [ 00:13:57.815 { 00:13:57.815 "name": "NewBaseBdev", 00:13:57.815 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:57.815 "is_configured": true, 00:13:57.815 "data_offset": 2048, 00:13:57.815 "data_size": 63488 00:13:57.815 }, 00:13:57.815 { 00:13:57.815 "name": "BaseBdev2", 00:13:57.815 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:57.815 "is_configured": true, 00:13:57.815 "data_offset": 2048, 00:13:57.815 "data_size": 63488 00:13:57.815 }, 00:13:57.815 { 00:13:57.815 "name": "BaseBdev3", 00:13:57.815 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:57.815 "is_configured": true, 00:13:57.815 "data_offset": 2048, 00:13:57.815 "data_size": 63488 00:13:57.815 } 00:13:57.815 ] 00:13:57.815 } 00:13:57.815 } 00:13:57.815 }' 00:13:57.815 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:58.073 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:58.073 BaseBdev2 00:13:58.073 BaseBdev3' 00:13:58.073 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:58.073 
18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:58.073 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:58.073 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:58.073 "name": "NewBaseBdev", 00:13:58.073 "aliases": [ 00:13:58.073 "6b2cd226-5f3f-4891-b2be-b9a122ca060f" 00:13:58.073 ], 00:13:58.073 "product_name": "Malloc disk", 00:13:58.073 "block_size": 512, 00:13:58.073 "num_blocks": 65536, 00:13:58.073 "uuid": "6b2cd226-5f3f-4891-b2be-b9a122ca060f", 00:13:58.073 "assigned_rate_limits": { 00:13:58.073 "rw_ios_per_sec": 0, 00:13:58.073 "rw_mbytes_per_sec": 0, 00:13:58.073 "r_mbytes_per_sec": 0, 00:13:58.073 "w_mbytes_per_sec": 0 00:13:58.073 }, 00:13:58.073 "claimed": true, 00:13:58.073 "claim_type": "exclusive_write", 00:13:58.073 "zoned": false, 00:13:58.073 "supported_io_types": { 00:13:58.073 "read": true, 00:13:58.073 "write": true, 00:13:58.073 "unmap": true, 00:13:58.073 "flush": true, 00:13:58.073 "reset": true, 00:13:58.073 "nvme_admin": false, 00:13:58.073 "nvme_io": false, 00:13:58.073 "nvme_io_md": false, 00:13:58.073 "write_zeroes": true, 00:13:58.073 "zcopy": true, 00:13:58.073 "get_zone_info": false, 00:13:58.073 "zone_management": false, 00:13:58.073 "zone_append": false, 00:13:58.073 "compare": false, 00:13:58.073 "compare_and_write": false, 00:13:58.073 "abort": true, 00:13:58.073 "seek_hole": false, 00:13:58.073 "seek_data": false, 00:13:58.073 "copy": true, 00:13:58.073 "nvme_iov_md": false 00:13:58.073 }, 00:13:58.073 "memory_domains": [ 00:13:58.073 { 00:13:58.073 "dma_device_id": "system", 00:13:58.073 "dma_device_type": 1 00:13:58.073 }, 00:13:58.073 { 00:13:58.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.073 "dma_device_type": 2 00:13:58.073 } 00:13:58.073 ], 00:13:58.073 
"driver_specific": {} 00:13:58.073 }' 00:13:58.073 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.331 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.331 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:58.331 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:58.331 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:58.331 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:58.331 18:17:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:58.331 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:58.331 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:58.332 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:58.589 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:58.590 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:58.590 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:58.590 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:58.590 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:58.848 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:58.848 "name": "BaseBdev2", 00:13:58.848 "aliases": [ 00:13:58.848 "ee803b37-e86f-4a4a-9394-62ba2f5e85ad" 00:13:58.848 ], 00:13:58.848 "product_name": 
"Malloc disk", 00:13:58.848 "block_size": 512, 00:13:58.848 "num_blocks": 65536, 00:13:58.848 "uuid": "ee803b37-e86f-4a4a-9394-62ba2f5e85ad", 00:13:58.848 "assigned_rate_limits": { 00:13:58.848 "rw_ios_per_sec": 0, 00:13:58.848 "rw_mbytes_per_sec": 0, 00:13:58.848 "r_mbytes_per_sec": 0, 00:13:58.848 "w_mbytes_per_sec": 0 00:13:58.848 }, 00:13:58.848 "claimed": true, 00:13:58.848 "claim_type": "exclusive_write", 00:13:58.848 "zoned": false, 00:13:58.848 "supported_io_types": { 00:13:58.848 "read": true, 00:13:58.848 "write": true, 00:13:58.848 "unmap": true, 00:13:58.848 "flush": true, 00:13:58.848 "reset": true, 00:13:58.848 "nvme_admin": false, 00:13:58.848 "nvme_io": false, 00:13:58.848 "nvme_io_md": false, 00:13:58.848 "write_zeroes": true, 00:13:58.848 "zcopy": true, 00:13:58.848 "get_zone_info": false, 00:13:58.848 "zone_management": false, 00:13:58.848 "zone_append": false, 00:13:58.848 "compare": false, 00:13:58.848 "compare_and_write": false, 00:13:58.848 "abort": true, 00:13:58.848 "seek_hole": false, 00:13:58.848 "seek_data": false, 00:13:58.848 "copy": true, 00:13:58.848 "nvme_iov_md": false 00:13:58.848 }, 00:13:58.848 "memory_domains": [ 00:13:58.848 { 00:13:58.848 "dma_device_id": "system", 00:13:58.848 "dma_device_type": 1 00:13:58.848 }, 00:13:58.848 { 00:13:58.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.848 "dma_device_type": 2 00:13:58.848 } 00:13:58.848 ], 00:13:58.848 "driver_specific": {} 00:13:58.848 }' 00:13:58.848 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.848 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:58.848 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:58.848 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:58.848 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:58.848 
18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:58.848 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:59.106 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:59.106 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:59.106 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:59.106 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:59.106 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:59.106 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:59.106 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:59.106 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:59.364 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:59.364 "name": "BaseBdev3", 00:13:59.364 "aliases": [ 00:13:59.364 "9ae007a2-b1bd-4bcd-96f3-679a2225d96a" 00:13:59.364 ], 00:13:59.364 "product_name": "Malloc disk", 00:13:59.364 "block_size": 512, 00:13:59.364 "num_blocks": 65536, 00:13:59.364 "uuid": "9ae007a2-b1bd-4bcd-96f3-679a2225d96a", 00:13:59.364 "assigned_rate_limits": { 00:13:59.364 "rw_ios_per_sec": 0, 00:13:59.364 "rw_mbytes_per_sec": 0, 00:13:59.364 "r_mbytes_per_sec": 0, 00:13:59.364 "w_mbytes_per_sec": 0 00:13:59.364 }, 00:13:59.364 "claimed": true, 00:13:59.364 "claim_type": "exclusive_write", 00:13:59.364 "zoned": false, 00:13:59.364 "supported_io_types": { 00:13:59.364 "read": true, 00:13:59.364 "write": true, 00:13:59.364 "unmap": true, 
00:13:59.364 "flush": true, 00:13:59.364 "reset": true, 00:13:59.364 "nvme_admin": false, 00:13:59.364 "nvme_io": false, 00:13:59.364 "nvme_io_md": false, 00:13:59.364 "write_zeroes": true, 00:13:59.364 "zcopy": true, 00:13:59.364 "get_zone_info": false, 00:13:59.364 "zone_management": false, 00:13:59.364 "zone_append": false, 00:13:59.364 "compare": false, 00:13:59.364 "compare_and_write": false, 00:13:59.364 "abort": true, 00:13:59.364 "seek_hole": false, 00:13:59.364 "seek_data": false, 00:13:59.364 "copy": true, 00:13:59.364 "nvme_iov_md": false 00:13:59.364 }, 00:13:59.364 "memory_domains": [ 00:13:59.364 { 00:13:59.364 "dma_device_id": "system", 00:13:59.364 "dma_device_type": 1 00:13:59.364 }, 00:13:59.364 { 00:13:59.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.364 "dma_device_type": 2 00:13:59.364 } 00:13:59.364 ], 00:13:59.364 "driver_specific": {} 00:13:59.364 }' 00:13:59.364 18:17:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:59.364 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:59.364 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:59.364 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:59.621 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:59.621 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:59.621 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:59.621 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:59.621 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:59.621 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:59.621 18:17:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:59.621 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:59.621 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:59.879 [2024-07-12 18:17:43.575660] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:59.879 [2024-07-12 18:17:43.575685] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:59.879 [2024-07-12 18:17:43.575731] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:59.879 [2024-07-12 18:17:43.575780] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:59.879 [2024-07-12 18:17:43.575792] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1945e90 name Existed_Raid, state offline 00:13:59.879 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2481876 00:13:59.879 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2481876 ']' 00:13:59.879 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2481876 00:13:59.879 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:59.879 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:59.879 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2481876 00:14:00.138 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:00.138 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:14:00.138 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2481876' 00:14:00.138 killing process with pid 2481876 00:14:00.138 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2481876 00:14:00.138 [2024-07-12 18:17:43.640699] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:00.138 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2481876 00:14:00.138 [2024-07-12 18:17:43.667831] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:00.396 18:17:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:00.396 00:14:00.396 real 0m28.865s 00:14:00.396 user 0m52.934s 00:14:00.396 sys 0m5.108s 00:14:00.396 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:00.396 18:17:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:00.396 ************************************ 00:14:00.396 END TEST raid_state_function_test_sb 00:14:00.396 ************************************ 00:14:00.396 18:17:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:00.396 18:17:43 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:00.396 18:17:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:00.396 18:17:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:00.396 18:17:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:00.396 ************************************ 00:14:00.396 START TEST raid_superblock_test 00:14:00.396 ************************************ 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=raid0 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2486168 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2486168 /var/tmp/spdk-raid.sock 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2486168 ']' 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:00.396 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:00.396 18:17:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:00.397 [2024-07-12 18:17:44.032609] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:14:00.397 [2024-07-12 18:17:44.032674] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2486168 ] 00:14:00.656 [2024-07-12 18:17:44.159758] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:00.656 [2024-07-12 18:17:44.261297] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:00.656 [2024-07-12 18:17:44.325857] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:00.656 [2024-07-12 18:17:44.325894] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:01.222 18:17:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:01.481 malloc1 00:14:01.481 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:01.757 [2024-07-12 18:17:45.334379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:01.757 [2024-07-12 18:17:45.334427] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:01.757 [2024-07-12 18:17:45.334447] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcc2570 00:14:01.757 [2024-07-12 18:17:45.334460] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:01.757 [2024-07-12 18:17:45.336177] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:01.757 [2024-07-12 18:17:45.336219] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:01.757 pt1 00:14:01.757 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:01.758 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:01.758 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:01.758 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:01.758 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:01.758 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:01.758 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:01.758 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:01.758 18:17:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:02.015 malloc2 00:14:02.016 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:02.274 [2024-07-12 18:17:45.809705] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:02.274 [2024-07-12 18:17:45.809749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:02.274 [2024-07-12 18:17:45.809767] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcc3970 00:14:02.274 [2024-07-12 18:17:45.809780] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:02.274 [2024-07-12 18:17:45.811358] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:02.274 [2024-07-12 18:17:45.811386] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:02.274 pt2 00:14:02.274 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:02.274 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:02.274 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:02.274 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:02.274 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:02.274 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:02.274 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:02.274 18:17:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:02.274 18:17:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:02.532 malloc3 00:14:02.532 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:02.532 [2024-07-12 18:17:46.215432] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:02.532 [2024-07-12 18:17:46.215474] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:02.532 [2024-07-12 18:17:46.215491] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe5a340 00:14:02.532 [2024-07-12 18:17:46.215503] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:02.532 [2024-07-12 18:17:46.217025] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:02.532 [2024-07-12 18:17:46.217052] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:02.532 pt3 00:14:02.532 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:02.532 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:02.532 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:02.815 [2024-07-12 18:17:46.444060] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:02.815 [2024-07-12 18:17:46.445400] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:14:02.815 [2024-07-12 18:17:46.445453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:02.815 [2024-07-12 18:17:46.445600] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcbaea0 00:14:02.815 [2024-07-12 18:17:46.445612] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:02.815 [2024-07-12 18:17:46.445812] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcc2240 00:14:02.815 [2024-07-12 18:17:46.445962] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcbaea0 00:14:02.815 [2024-07-12 18:17:46.445973] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcbaea0 00:14:02.815 [2024-07-12 18:17:46.446067] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.815 18:17:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.815 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:03.099 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.099 "name": "raid_bdev1", 00:14:03.099 "uuid": "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2", 00:14:03.099 "strip_size_kb": 64, 00:14:03.099 "state": "online", 00:14:03.099 "raid_level": "raid0", 00:14:03.099 "superblock": true, 00:14:03.099 "num_base_bdevs": 3, 00:14:03.099 "num_base_bdevs_discovered": 3, 00:14:03.099 "num_base_bdevs_operational": 3, 00:14:03.099 "base_bdevs_list": [ 00:14:03.099 { 00:14:03.099 "name": "pt1", 00:14:03.099 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:03.099 "is_configured": true, 00:14:03.099 "data_offset": 2048, 00:14:03.099 "data_size": 63488 00:14:03.099 }, 00:14:03.099 { 00:14:03.099 "name": "pt2", 00:14:03.099 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:03.099 "is_configured": true, 00:14:03.099 "data_offset": 2048, 00:14:03.099 "data_size": 63488 00:14:03.099 }, 00:14:03.099 { 00:14:03.099 "name": "pt3", 00:14:03.099 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:03.099 "is_configured": true, 00:14:03.099 "data_offset": 2048, 00:14:03.099 "data_size": 63488 00:14:03.099 } 00:14:03.099 ] 00:14:03.099 }' 00:14:03.099 18:17:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.099 18:17:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:04.047 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:04.047 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:04.047 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:14:04.047 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:04.047 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:04.047 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:04.047 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:04.047 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:04.306 [2024-07-12 18:17:47.783938] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:04.306 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:04.306 "name": "raid_bdev1", 00:14:04.306 "aliases": [ 00:14:04.306 "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2" 00:14:04.306 ], 00:14:04.306 "product_name": "Raid Volume", 00:14:04.306 "block_size": 512, 00:14:04.306 "num_blocks": 190464, 00:14:04.306 "uuid": "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2", 00:14:04.306 "assigned_rate_limits": { 00:14:04.306 "rw_ios_per_sec": 0, 00:14:04.306 "rw_mbytes_per_sec": 0, 00:14:04.306 "r_mbytes_per_sec": 0, 00:14:04.306 "w_mbytes_per_sec": 0 00:14:04.306 }, 00:14:04.306 "claimed": false, 00:14:04.306 "zoned": false, 00:14:04.306 "supported_io_types": { 00:14:04.306 "read": true, 00:14:04.306 "write": true, 00:14:04.306 "unmap": true, 00:14:04.306 "flush": true, 00:14:04.306 "reset": true, 00:14:04.306 "nvme_admin": false, 00:14:04.306 "nvme_io": false, 00:14:04.306 "nvme_io_md": false, 00:14:04.306 "write_zeroes": true, 00:14:04.306 "zcopy": false, 00:14:04.306 "get_zone_info": false, 00:14:04.306 "zone_management": false, 00:14:04.306 "zone_append": false, 00:14:04.306 "compare": false, 00:14:04.306 "compare_and_write": false, 00:14:04.306 "abort": false, 00:14:04.306 "seek_hole": false, 00:14:04.306 
"seek_data": false, 00:14:04.306 "copy": false, 00:14:04.306 "nvme_iov_md": false 00:14:04.306 }, 00:14:04.306 "memory_domains": [ 00:14:04.306 { 00:14:04.306 "dma_device_id": "system", 00:14:04.306 "dma_device_type": 1 00:14:04.306 }, 00:14:04.306 { 00:14:04.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.306 "dma_device_type": 2 00:14:04.306 }, 00:14:04.306 { 00:14:04.306 "dma_device_id": "system", 00:14:04.306 "dma_device_type": 1 00:14:04.306 }, 00:14:04.306 { 00:14:04.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.306 "dma_device_type": 2 00:14:04.306 }, 00:14:04.306 { 00:14:04.306 "dma_device_id": "system", 00:14:04.306 "dma_device_type": 1 00:14:04.306 }, 00:14:04.306 { 00:14:04.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.306 "dma_device_type": 2 00:14:04.306 } 00:14:04.306 ], 00:14:04.306 "driver_specific": { 00:14:04.306 "raid": { 00:14:04.306 "uuid": "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2", 00:14:04.306 "strip_size_kb": 64, 00:14:04.306 "state": "online", 00:14:04.306 "raid_level": "raid0", 00:14:04.306 "superblock": true, 00:14:04.306 "num_base_bdevs": 3, 00:14:04.306 "num_base_bdevs_discovered": 3, 00:14:04.306 "num_base_bdevs_operational": 3, 00:14:04.306 "base_bdevs_list": [ 00:14:04.306 { 00:14:04.306 "name": "pt1", 00:14:04.306 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:04.306 "is_configured": true, 00:14:04.306 "data_offset": 2048, 00:14:04.306 "data_size": 63488 00:14:04.306 }, 00:14:04.306 { 00:14:04.306 "name": "pt2", 00:14:04.306 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:04.306 "is_configured": true, 00:14:04.306 "data_offset": 2048, 00:14:04.306 "data_size": 63488 00:14:04.306 }, 00:14:04.306 { 00:14:04.306 "name": "pt3", 00:14:04.306 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:04.306 "is_configured": true, 00:14:04.306 "data_offset": 2048, 00:14:04.306 "data_size": 63488 00:14:04.306 } 00:14:04.306 ] 00:14:04.306 } 00:14:04.306 } 00:14:04.306 }' 00:14:04.306 18:17:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:04.306 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:04.306 pt2 00:14:04.306 pt3' 00:14:04.306 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:04.306 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:04.306 18:17:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:04.565 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:04.565 "name": "pt1", 00:14:04.565 "aliases": [ 00:14:04.565 "00000000-0000-0000-0000-000000000001" 00:14:04.565 ], 00:14:04.565 "product_name": "passthru", 00:14:04.565 "block_size": 512, 00:14:04.565 "num_blocks": 65536, 00:14:04.565 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:04.565 "assigned_rate_limits": { 00:14:04.565 "rw_ios_per_sec": 0, 00:14:04.565 "rw_mbytes_per_sec": 0, 00:14:04.565 "r_mbytes_per_sec": 0, 00:14:04.565 "w_mbytes_per_sec": 0 00:14:04.565 }, 00:14:04.565 "claimed": true, 00:14:04.565 "claim_type": "exclusive_write", 00:14:04.565 "zoned": false, 00:14:04.565 "supported_io_types": { 00:14:04.565 "read": true, 00:14:04.565 "write": true, 00:14:04.565 "unmap": true, 00:14:04.565 "flush": true, 00:14:04.565 "reset": true, 00:14:04.565 "nvme_admin": false, 00:14:04.565 "nvme_io": false, 00:14:04.565 "nvme_io_md": false, 00:14:04.565 "write_zeroes": true, 00:14:04.565 "zcopy": true, 00:14:04.565 "get_zone_info": false, 00:14:04.565 "zone_management": false, 00:14:04.565 "zone_append": false, 00:14:04.565 "compare": false, 00:14:04.565 "compare_and_write": false, 00:14:04.565 "abort": true, 00:14:04.565 "seek_hole": false, 00:14:04.565 "seek_data": false, 
00:14:04.565 "copy": true, 00:14:04.565 "nvme_iov_md": false 00:14:04.565 }, 00:14:04.565 "memory_domains": [ 00:14:04.565 { 00:14:04.565 "dma_device_id": "system", 00:14:04.565 "dma_device_type": 1 00:14:04.565 }, 00:14:04.565 { 00:14:04.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.565 "dma_device_type": 2 00:14:04.565 } 00:14:04.565 ], 00:14:04.565 "driver_specific": { 00:14:04.565 "passthru": { 00:14:04.565 "name": "pt1", 00:14:04.565 "base_bdev_name": "malloc1" 00:14:04.565 } 00:14:04.565 } 00:14:04.565 }' 00:14:04.565 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.565 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.565 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:04.565 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.565 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:14:04.823 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:05.081 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:05.081 "name": "pt2", 00:14:05.081 "aliases": [ 00:14:05.081 "00000000-0000-0000-0000-000000000002" 00:14:05.081 ], 00:14:05.082 "product_name": "passthru", 00:14:05.082 "block_size": 512, 00:14:05.082 "num_blocks": 65536, 00:14:05.082 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:05.082 "assigned_rate_limits": { 00:14:05.082 "rw_ios_per_sec": 0, 00:14:05.082 "rw_mbytes_per_sec": 0, 00:14:05.082 "r_mbytes_per_sec": 0, 00:14:05.082 "w_mbytes_per_sec": 0 00:14:05.082 }, 00:14:05.082 "claimed": true, 00:14:05.082 "claim_type": "exclusive_write", 00:14:05.082 "zoned": false, 00:14:05.082 "supported_io_types": { 00:14:05.082 "read": true, 00:14:05.082 "write": true, 00:14:05.082 "unmap": true, 00:14:05.082 "flush": true, 00:14:05.082 "reset": true, 00:14:05.082 "nvme_admin": false, 00:14:05.082 "nvme_io": false, 00:14:05.082 "nvme_io_md": false, 00:14:05.082 "write_zeroes": true, 00:14:05.082 "zcopy": true, 00:14:05.082 "get_zone_info": false, 00:14:05.082 "zone_management": false, 00:14:05.082 "zone_append": false, 00:14:05.082 "compare": false, 00:14:05.082 "compare_and_write": false, 00:14:05.082 "abort": true, 00:14:05.082 "seek_hole": false, 00:14:05.082 "seek_data": false, 00:14:05.082 "copy": true, 00:14:05.082 "nvme_iov_md": false 00:14:05.082 }, 00:14:05.082 "memory_domains": [ 00:14:05.082 { 00:14:05.082 "dma_device_id": "system", 00:14:05.082 "dma_device_type": 1 00:14:05.082 }, 00:14:05.082 { 00:14:05.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.082 "dma_device_type": 2 00:14:05.082 } 00:14:05.082 ], 00:14:05.082 "driver_specific": { 00:14:05.082 "passthru": { 00:14:05.082 "name": "pt2", 00:14:05.082 "base_bdev_name": "malloc2" 00:14:05.082 } 00:14:05.082 } 00:14:05.082 }' 00:14:05.082 18:17:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.082 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.340 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:05.340 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:05.340 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:05.340 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:05.340 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:05.340 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:05.340 18:17:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:05.340 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:05.340 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:05.597 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:05.597 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:05.597 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:05.597 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:05.855 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:05.855 "name": "pt3", 00:14:05.855 "aliases": [ 00:14:05.855 "00000000-0000-0000-0000-000000000003" 00:14:05.855 ], 00:14:05.855 "product_name": "passthru", 00:14:05.855 "block_size": 512, 00:14:05.855 "num_blocks": 65536, 00:14:05.855 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:05.855 "assigned_rate_limits": { 
00:14:05.855 "rw_ios_per_sec": 0, 00:14:05.855 "rw_mbytes_per_sec": 0, 00:14:05.855 "r_mbytes_per_sec": 0, 00:14:05.855 "w_mbytes_per_sec": 0 00:14:05.855 }, 00:14:05.855 "claimed": true, 00:14:05.855 "claim_type": "exclusive_write", 00:14:05.855 "zoned": false, 00:14:05.855 "supported_io_types": { 00:14:05.855 "read": true, 00:14:05.855 "write": true, 00:14:05.855 "unmap": true, 00:14:05.855 "flush": true, 00:14:05.855 "reset": true, 00:14:05.855 "nvme_admin": false, 00:14:05.855 "nvme_io": false, 00:14:05.855 "nvme_io_md": false, 00:14:05.855 "write_zeroes": true, 00:14:05.855 "zcopy": true, 00:14:05.855 "get_zone_info": false, 00:14:05.855 "zone_management": false, 00:14:05.855 "zone_append": false, 00:14:05.855 "compare": false, 00:14:05.855 "compare_and_write": false, 00:14:05.855 "abort": true, 00:14:05.855 "seek_hole": false, 00:14:05.855 "seek_data": false, 00:14:05.855 "copy": true, 00:14:05.855 "nvme_iov_md": false 00:14:05.855 }, 00:14:05.855 "memory_domains": [ 00:14:05.855 { 00:14:05.855 "dma_device_id": "system", 00:14:05.855 "dma_device_type": 1 00:14:05.855 }, 00:14:05.855 { 00:14:05.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.855 "dma_device_type": 2 00:14:05.855 } 00:14:05.855 ], 00:14:05.855 "driver_specific": { 00:14:05.855 "passthru": { 00:14:05.855 "name": "pt3", 00:14:05.855 "base_bdev_name": "malloc3" 00:14:05.855 } 00:14:05.855 } 00:14:05.855 }' 00:14:05.855 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.855 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.855 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:05.855 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:05.855 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:05.856 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:14:05.856 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:05.856 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.113 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.113 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.114 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.114 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:06.114 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:06.114 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:06.372 [2024-07-12 18:17:49.909548] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:06.372 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=3eaba237-1a0e-47fe-8d6d-5cb27d34dec2 00:14:06.372 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 3eaba237-1a0e-47fe-8d6d-5cb27d34dec2 ']' 00:14:06.372 18:17:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:06.939 [2024-07-12 18:17:50.414629] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:06.939 [2024-07-12 18:17:50.414653] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:06.939 [2024-07-12 18:17:50.414705] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:06.939 [2024-07-12 18:17:50.414754] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:14:06.939 [2024-07-12 18:17:50.414767] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcbaea0 name raid_bdev1, state offline 00:14:06.939 18:17:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.939 18:17:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:07.198 18:17:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:07.198 18:17:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:07.198 18:17:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:07.198 18:17:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:07.764 18:17:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:07.764 18:17:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:07.764 18:17:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:07.764 18:17:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:08.332 18:17:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:08.332 18:17:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:08.591 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:08.849 [2024-07-12 18:17:52.439885] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:08.849 [2024-07-12 18:17:52.441243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:08.849 [2024-07-12 18:17:52.441284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:08.849 [2024-07-12 18:17:52.441328] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:08.849 [2024-07-12 18:17:52.441366] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:08.849 [2024-07-12 18:17:52.441389] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:08.849 [2024-07-12 18:17:52.441407] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:08.849 [2024-07-12 18:17:52.441417] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe65ff0 name raid_bdev1, state configuring 00:14:08.849 request: 00:14:08.849 { 00:14:08.849 "name": "raid_bdev1", 00:14:08.849 "raid_level": "raid0", 00:14:08.849 "base_bdevs": [ 00:14:08.849 "malloc1", 00:14:08.849 "malloc2", 00:14:08.849 "malloc3" 00:14:08.849 ], 00:14:08.849 "strip_size_kb": 64, 00:14:08.849 "superblock": false, 00:14:08.849 "method": "bdev_raid_create", 00:14:08.849 "req_id": 1 00:14:08.849 } 00:14:08.849 Got JSON-RPC error response 00:14:08.849 response: 00:14:08.849 { 00:14:08.849 "code": -17, 00:14:08.849 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:08.849 } 00:14:08.849 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:08.849 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:14:08.849 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:08.849 18:17:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:08.849 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.849 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:09.107 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:09.107 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:09.107 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:09.365 [2024-07-12 18:17:52.929285] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:09.365 [2024-07-12 18:17:52.929326] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:09.365 [2024-07-12 18:17:52.929346] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcc27a0 00:14:09.365 [2024-07-12 18:17:52.929359] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:09.365 [2024-07-12 18:17:52.930995] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:09.365 [2024-07-12 18:17:52.931024] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:09.365 [2024-07-12 18:17:52.931089] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:09.365 [2024-07-12 18:17:52.931113] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:09.365 pt1 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.365 18:17:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:09.623 18:17:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.623 "name": "raid_bdev1", 00:14:09.623 "uuid": "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2", 00:14:09.623 "strip_size_kb": 64, 00:14:09.623 "state": "configuring", 00:14:09.623 "raid_level": "raid0", 00:14:09.623 "superblock": true, 00:14:09.623 "num_base_bdevs": 3, 00:14:09.623 "num_base_bdevs_discovered": 1, 00:14:09.623 "num_base_bdevs_operational": 3, 00:14:09.623 "base_bdevs_list": [ 00:14:09.623 { 00:14:09.623 "name": "pt1", 00:14:09.623 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:09.623 
"is_configured": true, 00:14:09.623 "data_offset": 2048, 00:14:09.623 "data_size": 63488 00:14:09.623 }, 00:14:09.623 { 00:14:09.623 "name": null, 00:14:09.623 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:09.623 "is_configured": false, 00:14:09.623 "data_offset": 2048, 00:14:09.623 "data_size": 63488 00:14:09.623 }, 00:14:09.623 { 00:14:09.623 "name": null, 00:14:09.623 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:09.623 "is_configured": false, 00:14:09.623 "data_offset": 2048, 00:14:09.623 "data_size": 63488 00:14:09.623 } 00:14:09.623 ] 00:14:09.623 }' 00:14:09.623 18:17:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.623 18:17:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.189 18:17:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:10.189 18:17:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:10.448 [2024-07-12 18:17:54.040274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:10.448 [2024-07-12 18:17:54.040319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:10.448 [2024-07-12 18:17:54.040337] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcb9c70 00:14:10.448 [2024-07-12 18:17:54.040350] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:10.448 [2024-07-12 18:17:54.040680] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:10.448 [2024-07-12 18:17:54.040697] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:10.448 [2024-07-12 18:17:54.040757] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:10.448 [2024-07-12 
18:17:54.040775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:10.448 pt2 00:14:10.448 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:10.706 [2024-07-12 18:17:54.284956] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.706 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:10.964 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.964 "name": "raid_bdev1", 00:14:10.964 
"uuid": "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2", 00:14:10.964 "strip_size_kb": 64, 00:14:10.964 "state": "configuring", 00:14:10.964 "raid_level": "raid0", 00:14:10.964 "superblock": true, 00:14:10.964 "num_base_bdevs": 3, 00:14:10.964 "num_base_bdevs_discovered": 1, 00:14:10.964 "num_base_bdevs_operational": 3, 00:14:10.964 "base_bdevs_list": [ 00:14:10.964 { 00:14:10.964 "name": "pt1", 00:14:10.964 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:10.964 "is_configured": true, 00:14:10.964 "data_offset": 2048, 00:14:10.964 "data_size": 63488 00:14:10.964 }, 00:14:10.964 { 00:14:10.964 "name": null, 00:14:10.964 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:10.964 "is_configured": false, 00:14:10.964 "data_offset": 2048, 00:14:10.964 "data_size": 63488 00:14:10.964 }, 00:14:10.964 { 00:14:10.964 "name": null, 00:14:10.964 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:10.964 "is_configured": false, 00:14:10.964 "data_offset": 2048, 00:14:10.964 "data_size": 63488 00:14:10.964 } 00:14:10.964 ] 00:14:10.964 }' 00:14:10.964 18:17:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.964 18:17:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:11.529 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:11.529 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:11.529 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:11.812 [2024-07-12 18:17:55.375820] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:11.812 [2024-07-12 18:17:55.375865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:11.812 [2024-07-12 18:17:55.375883] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe5afa0 00:14:11.812 [2024-07-12 18:17:55.375895] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:11.812 [2024-07-12 18:17:55.376228] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:11.812 [2024-07-12 18:17:55.376246] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:11.812 [2024-07-12 18:17:55.376306] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:11.812 [2024-07-12 18:17:55.376323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:11.812 pt2 00:14:11.812 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:11.812 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:11.812 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:12.072 [2024-07-12 18:17:55.624488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:12.072 [2024-07-12 18:17:55.624520] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:12.072 [2024-07-12 18:17:55.624537] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe5bb30 00:14:12.072 [2024-07-12 18:17:55.624549] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:12.072 [2024-07-12 18:17:55.624834] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:12.072 [2024-07-12 18:17:55.624851] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:12.072 [2024-07-12 18:17:55.624900] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:12.072 
[2024-07-12 18:17:55.624917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:12.072 [2024-07-12 18:17:55.625026] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe5cc00 00:14:12.072 [2024-07-12 18:17:55.625037] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:12.072 [2024-07-12 18:17:55.625199] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe659b0 00:14:12.072 [2024-07-12 18:17:55.625317] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe5cc00 00:14:12.072 [2024-07-12 18:17:55.625327] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe5cc00 00:14:12.072 [2024-07-12 18:17:55.625416] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:12.072 pt3 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.072 18:17:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.072 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:12.331 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.331 "name": "raid_bdev1", 00:14:12.331 "uuid": "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2", 00:14:12.331 "strip_size_kb": 64, 00:14:12.331 "state": "online", 00:14:12.331 "raid_level": "raid0", 00:14:12.331 "superblock": true, 00:14:12.331 "num_base_bdevs": 3, 00:14:12.331 "num_base_bdevs_discovered": 3, 00:14:12.331 "num_base_bdevs_operational": 3, 00:14:12.331 "base_bdevs_list": [ 00:14:12.331 { 00:14:12.331 "name": "pt1", 00:14:12.331 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:12.331 "is_configured": true, 00:14:12.331 "data_offset": 2048, 00:14:12.331 "data_size": 63488 00:14:12.331 }, 00:14:12.331 { 00:14:12.331 "name": "pt2", 00:14:12.331 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:12.331 "is_configured": true, 00:14:12.331 "data_offset": 2048, 00:14:12.331 "data_size": 63488 00:14:12.331 }, 00:14:12.331 { 00:14:12.331 "name": "pt3", 00:14:12.331 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:12.331 "is_configured": true, 00:14:12.331 "data_offset": 2048, 00:14:12.331 "data_size": 63488 00:14:12.331 } 00:14:12.331 ] 00:14:12.331 }' 00:14:12.331 18:17:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.331 18:17:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.897 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:14:12.897 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:12.897 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:12.897 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:12.897 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:12.897 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:12.897 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:12.897 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:13.156 [2024-07-12 18:17:56.627425] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:13.156 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:13.156 "name": "raid_bdev1", 00:14:13.156 "aliases": [ 00:14:13.156 "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2" 00:14:13.156 ], 00:14:13.156 "product_name": "Raid Volume", 00:14:13.156 "block_size": 512, 00:14:13.156 "num_blocks": 190464, 00:14:13.156 "uuid": "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2", 00:14:13.156 "assigned_rate_limits": { 00:14:13.156 "rw_ios_per_sec": 0, 00:14:13.156 "rw_mbytes_per_sec": 0, 00:14:13.156 "r_mbytes_per_sec": 0, 00:14:13.156 "w_mbytes_per_sec": 0 00:14:13.156 }, 00:14:13.156 "claimed": false, 00:14:13.156 "zoned": false, 00:14:13.156 "supported_io_types": { 00:14:13.156 "read": true, 00:14:13.156 "write": true, 00:14:13.156 "unmap": true, 00:14:13.156 "flush": true, 00:14:13.156 "reset": true, 00:14:13.156 "nvme_admin": false, 00:14:13.156 "nvme_io": false, 00:14:13.156 "nvme_io_md": false, 00:14:13.156 "write_zeroes": true, 00:14:13.156 "zcopy": false, 00:14:13.156 
"get_zone_info": false, 00:14:13.156 "zone_management": false, 00:14:13.156 "zone_append": false, 00:14:13.156 "compare": false, 00:14:13.156 "compare_and_write": false, 00:14:13.156 "abort": false, 00:14:13.156 "seek_hole": false, 00:14:13.156 "seek_data": false, 00:14:13.156 "copy": false, 00:14:13.156 "nvme_iov_md": false 00:14:13.156 }, 00:14:13.156 "memory_domains": [ 00:14:13.156 { 00:14:13.156 "dma_device_id": "system", 00:14:13.156 "dma_device_type": 1 00:14:13.156 }, 00:14:13.156 { 00:14:13.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.156 "dma_device_type": 2 00:14:13.156 }, 00:14:13.156 { 00:14:13.156 "dma_device_id": "system", 00:14:13.156 "dma_device_type": 1 00:14:13.156 }, 00:14:13.156 { 00:14:13.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.156 "dma_device_type": 2 00:14:13.156 }, 00:14:13.156 { 00:14:13.156 "dma_device_id": "system", 00:14:13.156 "dma_device_type": 1 00:14:13.156 }, 00:14:13.156 { 00:14:13.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.156 "dma_device_type": 2 00:14:13.156 } 00:14:13.156 ], 00:14:13.156 "driver_specific": { 00:14:13.156 "raid": { 00:14:13.156 "uuid": "3eaba237-1a0e-47fe-8d6d-5cb27d34dec2", 00:14:13.156 "strip_size_kb": 64, 00:14:13.156 "state": "online", 00:14:13.156 "raid_level": "raid0", 00:14:13.156 "superblock": true, 00:14:13.156 "num_base_bdevs": 3, 00:14:13.156 "num_base_bdevs_discovered": 3, 00:14:13.156 "num_base_bdevs_operational": 3, 00:14:13.156 "base_bdevs_list": [ 00:14:13.156 { 00:14:13.156 "name": "pt1", 00:14:13.156 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:13.156 "is_configured": true, 00:14:13.156 "data_offset": 2048, 00:14:13.156 "data_size": 63488 00:14:13.156 }, 00:14:13.156 { 00:14:13.156 "name": "pt2", 00:14:13.156 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:13.156 "is_configured": true, 00:14:13.156 "data_offset": 2048, 00:14:13.156 "data_size": 63488 00:14:13.156 }, 00:14:13.156 { 00:14:13.156 "name": "pt3", 00:14:13.156 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:14:13.156 "is_configured": true, 00:14:13.156 "data_offset": 2048, 00:14:13.156 "data_size": 63488 00:14:13.156 } 00:14:13.156 ] 00:14:13.156 } 00:14:13.156 } 00:14:13.156 }' 00:14:13.156 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:13.156 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:13.156 pt2 00:14:13.156 pt3' 00:14:13.156 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:13.156 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.156 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:13.415 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.415 "name": "pt1", 00:14:13.415 "aliases": [ 00:14:13.415 "00000000-0000-0000-0000-000000000001" 00:14:13.415 ], 00:14:13.415 "product_name": "passthru", 00:14:13.415 "block_size": 512, 00:14:13.415 "num_blocks": 65536, 00:14:13.415 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:13.415 "assigned_rate_limits": { 00:14:13.415 "rw_ios_per_sec": 0, 00:14:13.415 "rw_mbytes_per_sec": 0, 00:14:13.415 "r_mbytes_per_sec": 0, 00:14:13.415 "w_mbytes_per_sec": 0 00:14:13.415 }, 00:14:13.415 "claimed": true, 00:14:13.415 "claim_type": "exclusive_write", 00:14:13.415 "zoned": false, 00:14:13.415 "supported_io_types": { 00:14:13.415 "read": true, 00:14:13.415 "write": true, 00:14:13.415 "unmap": true, 00:14:13.415 "flush": true, 00:14:13.415 "reset": true, 00:14:13.415 "nvme_admin": false, 00:14:13.415 "nvme_io": false, 00:14:13.415 "nvme_io_md": false, 00:14:13.415 "write_zeroes": true, 00:14:13.415 "zcopy": true, 00:14:13.415 "get_zone_info": false, 
00:14:13.415 "zone_management": false, 00:14:13.415 "zone_append": false, 00:14:13.415 "compare": false, 00:14:13.415 "compare_and_write": false, 00:14:13.415 "abort": true, 00:14:13.415 "seek_hole": false, 00:14:13.415 "seek_data": false, 00:14:13.415 "copy": true, 00:14:13.415 "nvme_iov_md": false 00:14:13.415 }, 00:14:13.415 "memory_domains": [ 00:14:13.415 { 00:14:13.415 "dma_device_id": "system", 00:14:13.415 "dma_device_type": 1 00:14:13.415 }, 00:14:13.415 { 00:14:13.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.415 "dma_device_type": 2 00:14:13.415 } 00:14:13.415 ], 00:14:13.415 "driver_specific": { 00:14:13.415 "passthru": { 00:14:13.415 "name": "pt1", 00:14:13.415 "base_bdev_name": "malloc1" 00:14:13.415 } 00:14:13.415 } 00:14:13.415 }' 00:14:13.415 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.415 18:17:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.415 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.415 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.415 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.415 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.415 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.673 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.673 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.673 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.673 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.673 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.673 18:17:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:13.673 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:13.673 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.931 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.931 "name": "pt2", 00:14:13.931 "aliases": [ 00:14:13.931 "00000000-0000-0000-0000-000000000002" 00:14:13.931 ], 00:14:13.931 "product_name": "passthru", 00:14:13.931 "block_size": 512, 00:14:13.931 "num_blocks": 65536, 00:14:13.931 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:13.931 "assigned_rate_limits": { 00:14:13.931 "rw_ios_per_sec": 0, 00:14:13.931 "rw_mbytes_per_sec": 0, 00:14:13.931 "r_mbytes_per_sec": 0, 00:14:13.931 "w_mbytes_per_sec": 0 00:14:13.931 }, 00:14:13.931 "claimed": true, 00:14:13.931 "claim_type": "exclusive_write", 00:14:13.931 "zoned": false, 00:14:13.931 "supported_io_types": { 00:14:13.931 "read": true, 00:14:13.931 "write": true, 00:14:13.931 "unmap": true, 00:14:13.931 "flush": true, 00:14:13.931 "reset": true, 00:14:13.931 "nvme_admin": false, 00:14:13.931 "nvme_io": false, 00:14:13.931 "nvme_io_md": false, 00:14:13.931 "write_zeroes": true, 00:14:13.931 "zcopy": true, 00:14:13.931 "get_zone_info": false, 00:14:13.931 "zone_management": false, 00:14:13.931 "zone_append": false, 00:14:13.931 "compare": false, 00:14:13.931 "compare_and_write": false, 00:14:13.931 "abort": true, 00:14:13.931 "seek_hole": false, 00:14:13.931 "seek_data": false, 00:14:13.931 "copy": true, 00:14:13.931 "nvme_iov_md": false 00:14:13.931 }, 00:14:13.931 "memory_domains": [ 00:14:13.931 { 00:14:13.931 "dma_device_id": "system", 00:14:13.931 "dma_device_type": 1 00:14:13.931 }, 00:14:13.931 { 00:14:13.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.931 
"dma_device_type": 2 00:14:13.931 } 00:14:13.931 ], 00:14:13.931 "driver_specific": { 00:14:13.931 "passthru": { 00:14:13.931 "name": "pt2", 00:14:13.931 "base_bdev_name": "malloc2" 00:14:13.931 } 00:14:13.931 } 00:14:13.931 }' 00:14:13.931 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.931 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.931 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.931 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:14.189 18:17:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:14.446 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:14.447 "name": "pt3", 00:14:14.447 "aliases": [ 00:14:14.447 
"00000000-0000-0000-0000-000000000003" 00:14:14.447 ], 00:14:14.447 "product_name": "passthru", 00:14:14.447 "block_size": 512, 00:14:14.447 "num_blocks": 65536, 00:14:14.447 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:14.447 "assigned_rate_limits": { 00:14:14.447 "rw_ios_per_sec": 0, 00:14:14.447 "rw_mbytes_per_sec": 0, 00:14:14.447 "r_mbytes_per_sec": 0, 00:14:14.447 "w_mbytes_per_sec": 0 00:14:14.447 }, 00:14:14.447 "claimed": true, 00:14:14.447 "claim_type": "exclusive_write", 00:14:14.447 "zoned": false, 00:14:14.447 "supported_io_types": { 00:14:14.447 "read": true, 00:14:14.447 "write": true, 00:14:14.447 "unmap": true, 00:14:14.447 "flush": true, 00:14:14.447 "reset": true, 00:14:14.447 "nvme_admin": false, 00:14:14.447 "nvme_io": false, 00:14:14.447 "nvme_io_md": false, 00:14:14.447 "write_zeroes": true, 00:14:14.447 "zcopy": true, 00:14:14.447 "get_zone_info": false, 00:14:14.447 "zone_management": false, 00:14:14.447 "zone_append": false, 00:14:14.447 "compare": false, 00:14:14.447 "compare_and_write": false, 00:14:14.447 "abort": true, 00:14:14.447 "seek_hole": false, 00:14:14.447 "seek_data": false, 00:14:14.447 "copy": true, 00:14:14.447 "nvme_iov_md": false 00:14:14.447 }, 00:14:14.447 "memory_domains": [ 00:14:14.447 { 00:14:14.447 "dma_device_id": "system", 00:14:14.447 "dma_device_type": 1 00:14:14.447 }, 00:14:14.447 { 00:14:14.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:14.447 "dma_device_type": 2 00:14:14.447 } 00:14:14.447 ], 00:14:14.447 "driver_specific": { 00:14:14.447 "passthru": { 00:14:14.447 "name": "pt3", 00:14:14.447 "base_bdev_name": "malloc3" 00:14:14.447 } 00:14:14.447 } 00:14:14.447 }' 00:14:14.447 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.447 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:14.705 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:14.705 18:17:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.705 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:14.705 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:14.705 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.705 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:14.705 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:14.705 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.705 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:14.963 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:14.963 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:14.963 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:15.221 [2024-07-12 18:17:58.692891] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 3eaba237-1a0e-47fe-8d6d-5cb27d34dec2 '!=' 3eaba237-1a0e-47fe-8d6d-5cb27d34dec2 ']' 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2486168 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2486168 ']' 00:14:15.221 18:17:58 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2486168 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2486168 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2486168' 00:14:15.221 killing process with pid 2486168 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2486168 00:14:15.221 [2024-07-12 18:17:58.764807] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:15.221 [2024-07-12 18:17:58.764857] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:15.221 [2024-07-12 18:17:58.764908] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:15.221 [2024-07-12 18:17:58.764919] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe5cc00 name raid_bdev1, state offline 00:14:15.221 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2486168 00:14:15.221 [2024-07-12 18:17:58.791178] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:15.480 18:17:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:15.480 00:14:15.480 real 0m15.026s 00:14:15.480 user 0m27.154s 00:14:15.480 sys 0m2.611s 00:14:15.480 18:17:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:15.480 18:17:58 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:15.480 ************************************ 00:14:15.480 END TEST raid_superblock_test 00:14:15.480 ************************************ 00:14:15.480 18:17:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:15.480 18:17:59 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:15.480 18:17:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:15.480 18:17:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:15.480 18:17:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:15.480 ************************************ 00:14:15.480 START TEST raid_read_error_test 00:14:15.480 ************************************ 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:15.480 18:17:59 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.CBNpcZdKTS 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2488399 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2488399 /var/tmp/spdk-raid.sock 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2488399 ']' 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:15.480 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:15.480 18:17:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:15.480 [2024-07-12 18:17:59.152810] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:14:15.480 [2024-07-12 18:17:59.152882] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2488399 ] 00:14:15.739 [2024-07-12 18:17:59.282293] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:15.739 [2024-07-12 18:17:59.389031] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:15.739 [2024-07-12 18:17:59.460262] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:15.739 [2024-07-12 18:17:59.460300] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:16.673 18:18:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:16.673 18:18:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:16.673 18:18:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:16.673 18:18:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:16.673 BaseBdev1_malloc 00:14:16.673 18:18:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:16.932 true 00:14:16.932 18:18:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:17.190 [2024-07-12 18:18:00.807072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:17.190 [2024-07-12 18:18:00.807118] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:17.190 [2024-07-12 18:18:00.807139] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15e70d0 00:14:17.190 [2024-07-12 18:18:00.807152] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:17.190 [2024-07-12 18:18:00.809058] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:17.190 [2024-07-12 18:18:00.809088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:17.190 BaseBdev1 00:14:17.190 18:18:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:17.190 18:18:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:17.449 BaseBdev2_malloc 00:14:17.449 18:18:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:17.707 true 00:14:17.707 18:18:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:17.964 [2024-07-12 18:18:01.550727] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:17.964 [2024-07-12 18:18:01.550771] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:17.964 [2024-07-12 18:18:01.550792] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15eb910 00:14:17.964 [2024-07-12 18:18:01.550805] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:17.964 [2024-07-12 18:18:01.552406] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:17.964 [2024-07-12 18:18:01.552433] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:17.964 BaseBdev2 00:14:17.964 18:18:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:17.964 18:18:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:18.221 BaseBdev3_malloc 00:14:18.221 18:18:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:18.478 true 00:14:18.478 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:18.735 [2024-07-12 18:18:02.270177] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:18.735 [2024-07-12 18:18:02.270221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:18.735 [2024-07-12 18:18:02.270242] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15edbd0 00:14:18.735 [2024-07-12 18:18:02.270255] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:18.735 [2024-07-12 18:18:02.271827] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:18.735 [2024-07-12 18:18:02.271855] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:18.735 BaseBdev3 00:14:18.735 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:18.992 [2024-07-12 18:18:02.510846] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:18.992 [2024-07-12 18:18:02.512202] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:18.992 [2024-07-12 18:18:02.512273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:18.992 [2024-07-12 18:18:02.512475] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15ef280 00:14:18.992 [2024-07-12 18:18:02.512486] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:18.992 [2024-07-12 18:18:02.512686] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15eee20 00:14:18.992 [2024-07-12 18:18:02.512834] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15ef280 00:14:18.992 [2024-07-12 18:18:02.512844] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15ef280 00:14:18.992 [2024-07-12 18:18:02.512957] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:18.992 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:18.992 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:18.992 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:18.992 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:18.992 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:18.993 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:18.993 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.993 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.993 
18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.993 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.993 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.993 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:19.250 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.250 "name": "raid_bdev1", 00:14:19.250 "uuid": "b29753af-2e41-4d57-913d-8ac69526f334", 00:14:19.250 "strip_size_kb": 64, 00:14:19.250 "state": "online", 00:14:19.250 "raid_level": "raid0", 00:14:19.250 "superblock": true, 00:14:19.250 "num_base_bdevs": 3, 00:14:19.250 "num_base_bdevs_discovered": 3, 00:14:19.250 "num_base_bdevs_operational": 3, 00:14:19.250 "base_bdevs_list": [ 00:14:19.250 { 00:14:19.250 "name": "BaseBdev1", 00:14:19.250 "uuid": "f020cb51-cb62-5ac8-9d45-685e860052b4", 00:14:19.250 "is_configured": true, 00:14:19.250 "data_offset": 2048, 00:14:19.250 "data_size": 63488 00:14:19.250 }, 00:14:19.250 { 00:14:19.250 "name": "BaseBdev2", 00:14:19.250 "uuid": "70ceebab-aabe-51ab-a561-8a5aaf88759a", 00:14:19.250 "is_configured": true, 00:14:19.250 "data_offset": 2048, 00:14:19.250 "data_size": 63488 00:14:19.250 }, 00:14:19.250 { 00:14:19.250 "name": "BaseBdev3", 00:14:19.250 "uuid": "03f687bb-aa06-59ff-8d3e-93b528980868", 00:14:19.250 "is_configured": true, 00:14:19.250 "data_offset": 2048, 00:14:19.250 "data_size": 63488 00:14:19.250 } 00:14:19.250 ] 00:14:19.250 }' 00:14:19.250 18:18:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.250 18:18:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.815 18:18:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:14:19.815 18:18:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:19.815 [2024-07-12 18:18:03.481693] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x143d5b0 00:14:20.745 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.003 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:21.261 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:21.261 "name": "raid_bdev1", 00:14:21.261 "uuid": "b29753af-2e41-4d57-913d-8ac69526f334", 00:14:21.261 "strip_size_kb": 64, 00:14:21.261 "state": "online", 00:14:21.261 "raid_level": "raid0", 00:14:21.261 "superblock": true, 00:14:21.261 "num_base_bdevs": 3, 00:14:21.261 "num_base_bdevs_discovered": 3, 00:14:21.261 "num_base_bdevs_operational": 3, 00:14:21.261 "base_bdevs_list": [ 00:14:21.261 { 00:14:21.261 "name": "BaseBdev1", 00:14:21.261 "uuid": "f020cb51-cb62-5ac8-9d45-685e860052b4", 00:14:21.261 "is_configured": true, 00:14:21.261 "data_offset": 2048, 00:14:21.261 "data_size": 63488 00:14:21.261 }, 00:14:21.261 { 00:14:21.261 "name": "BaseBdev2", 00:14:21.261 "uuid": "70ceebab-aabe-51ab-a561-8a5aaf88759a", 00:14:21.261 "is_configured": true, 00:14:21.261 "data_offset": 2048, 00:14:21.261 "data_size": 63488 00:14:21.261 }, 00:14:21.261 { 00:14:21.261 "name": "BaseBdev3", 00:14:21.261 "uuid": "03f687bb-aa06-59ff-8d3e-93b528980868", 00:14:21.261 "is_configured": true, 00:14:21.261 "data_offset": 2048, 00:14:21.261 "data_size": 63488 00:14:21.261 } 00:14:21.261 ] 00:14:21.261 }' 00:14:21.261 18:18:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:21.261 18:18:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.826 18:18:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:22.106 [2024-07-12 18:18:05.719828] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:22.106 [2024-07-12 18:18:05.719871] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:22.106 [2024-07-12 18:18:05.723041] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:22.106 [2024-07-12 18:18:05.723078] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:22.106 [2024-07-12 18:18:05.723114] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:22.106 [2024-07-12 18:18:05.723126] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15ef280 name raid_bdev1, state offline 00:14:22.106 0 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2488399 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2488399 ']' 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2488399 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2488399 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2488399' 00:14:22.106 killing process with pid 2488399 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2488399 00:14:22.106 [2024-07-12 18:18:05.787818] bdev_raid.c:1358:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:14:22.106 18:18:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2488399 00:14:22.106 [2024-07-12 18:18:05.808958] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.CBNpcZdKTS 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:14:22.390 00:14:22.390 real 0m6.972s 00:14:22.390 user 0m11.004s 00:14:22.390 sys 0m1.252s 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:22.390 18:18:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.390 ************************************ 00:14:22.390 END TEST raid_read_error_test 00:14:22.390 ************************************ 00:14:22.390 18:18:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:22.390 18:18:06 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:14:22.390 18:18:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:22.390 18:18:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:22.390 18:18:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:22.657 ************************************ 
00:14:22.657 START TEST raid_write_error_test 00:14:22.657 ************************************ 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ObEQl0zSi2 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2490002 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2490002 /var/tmp/spdk-raid.sock 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2490002 ']' 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:22.657 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:22.657 18:18:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:22.657 [2024-07-12 18:18:06.259259] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:14:22.657 [2024-07-12 18:18:06.259395] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2490002 ] 00:14:22.916 [2024-07-12 18:18:06.454424] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:22.916 [2024-07-12 18:18:06.553565] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:22.916 [2024-07-12 18:18:06.618826] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:22.916 [2024-07-12 18:18:06.618874] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:23.482 18:18:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:23.482 18:18:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:23.482 18:18:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:23.482 18:18:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:24.048 BaseBdev1_malloc 00:14:24.048 18:18:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:24.307 true 00:14:24.307 18:18:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:24.566 [2024-07-12 18:18:08.105503] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:24.566 [2024-07-12 18:18:08.105548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:24.566 [2024-07-12 18:18:08.105569] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x180d0d0 00:14:24.566 [2024-07-12 18:18:08.105581] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:24.566 [2024-07-12 18:18:08.107468] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:24.566 [2024-07-12 18:18:08.107498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:24.566 BaseBdev1 00:14:24.566 18:18:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:24.566 18:18:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:24.824 BaseBdev2_malloc 00:14:24.824 18:18:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:25.082 true 00:14:25.082 18:18:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:25.648 [2024-07-12 18:18:09.073852] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:25.648 [2024-07-12 18:18:09.073896] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:25.648 [2024-07-12 18:18:09.073917] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1811910 00:14:25.648 [2024-07-12 18:18:09.073936] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:25.648 [2024-07-12 18:18:09.075509] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:25.648 [2024-07-12 18:18:09.075537] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:25.648 BaseBdev2 00:14:25.648 18:18:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:25.648 18:18:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:25.648 BaseBdev3_malloc 00:14:25.648 18:18:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:25.907 true 00:14:25.907 18:18:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:26.474 [2024-07-12 18:18:10.062230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:26.474 [2024-07-12 18:18:10.062280] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:26.474 [2024-07-12 18:18:10.062301] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1813bd0 00:14:26.474 [2024-07-12 18:18:10.062313] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:26.474 [2024-07-12 18:18:10.063949] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:26.474 [2024-07-12 18:18:10.063984] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:26.474 BaseBdev3 00:14:26.474 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:26.733 [2024-07-12 18:18:10.318942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:26.733 [2024-07-12 18:18:10.320320] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:26.733 [2024-07-12 18:18:10.320391] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:26.733 [2024-07-12 18:18:10.320598] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1815280 00:14:26.733 [2024-07-12 18:18:10.320610] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:26.733 [2024-07-12 18:18:10.320811] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1814e20 00:14:26.733 [2024-07-12 18:18:10.320966] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1815280 00:14:26.733 [2024-07-12 18:18:10.320976] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1815280 00:14:26.733 [2024-07-12 18:18:10.321084] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:26.733 18:18:10 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.733 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:27.300 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:27.300 "name": "raid_bdev1", 00:14:27.300 "uuid": "9bf5460d-13fa-4b52-8474-70cf147c7ea8", 00:14:27.300 "strip_size_kb": 64, 00:14:27.300 "state": "online", 00:14:27.300 "raid_level": "raid0", 00:14:27.300 "superblock": true, 00:14:27.300 "num_base_bdevs": 3, 00:14:27.300 "num_base_bdevs_discovered": 3, 00:14:27.300 "num_base_bdevs_operational": 3, 00:14:27.300 "base_bdevs_list": [ 00:14:27.300 { 00:14:27.300 "name": "BaseBdev1", 00:14:27.300 "uuid": "be995a7e-0c87-57db-9277-a3b51832cdbf", 00:14:27.300 "is_configured": true, 00:14:27.300 "data_offset": 2048, 00:14:27.300 "data_size": 63488 00:14:27.300 }, 00:14:27.300 { 00:14:27.300 "name": "BaseBdev2", 00:14:27.300 "uuid": "67c384ef-0691-5e2a-9254-b48463471fb5", 
00:14:27.300 "is_configured": true, 00:14:27.300 "data_offset": 2048, 00:14:27.300 "data_size": 63488 00:14:27.300 }, 00:14:27.300 { 00:14:27.300 "name": "BaseBdev3", 00:14:27.300 "uuid": "552d27f3-034c-59e3-9a6a-ac64040eeb7d", 00:14:27.300 "is_configured": true, 00:14:27.300 "data_offset": 2048, 00:14:27.300 "data_size": 63488 00:14:27.300 } 00:14:27.300 ] 00:14:27.300 }' 00:14:27.300 18:18:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:27.300 18:18:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:27.865 18:18:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:27.865 18:18:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:27.865 [2024-07-12 18:18:11.558485] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16635b0 00:14:28.800 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.058 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:29.315 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:29.315 "name": "raid_bdev1", 00:14:29.315 "uuid": "9bf5460d-13fa-4b52-8474-70cf147c7ea8", 00:14:29.315 "strip_size_kb": 64, 00:14:29.315 "state": "online", 00:14:29.315 "raid_level": "raid0", 00:14:29.315 "superblock": true, 00:14:29.315 "num_base_bdevs": 3, 00:14:29.315 "num_base_bdevs_discovered": 3, 00:14:29.315 "num_base_bdevs_operational": 3, 00:14:29.315 "base_bdevs_list": [ 00:14:29.315 { 00:14:29.315 "name": "BaseBdev1", 00:14:29.315 "uuid": "be995a7e-0c87-57db-9277-a3b51832cdbf", 00:14:29.315 "is_configured": true, 00:14:29.315 "data_offset": 2048, 00:14:29.315 "data_size": 63488 00:14:29.315 }, 00:14:29.315 { 00:14:29.315 "name": "BaseBdev2", 00:14:29.315 "uuid": "67c384ef-0691-5e2a-9254-b48463471fb5", 00:14:29.315 "is_configured": true, 00:14:29.315 "data_offset": 2048, 00:14:29.315 "data_size": 63488 00:14:29.315 }, 00:14:29.315 { 00:14:29.315 
"name": "BaseBdev3", 00:14:29.315 "uuid": "552d27f3-034c-59e3-9a6a-ac64040eeb7d", 00:14:29.315 "is_configured": true, 00:14:29.315 "data_offset": 2048, 00:14:29.315 "data_size": 63488 00:14:29.315 } 00:14:29.315 ] 00:14:29.315 }' 00:14:29.315 18:18:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:29.315 18:18:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.881 18:18:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:30.138 [2024-07-12 18:18:13.767565] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:30.138 [2024-07-12 18:18:13.767603] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:30.138 [2024-07-12 18:18:13.770756] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:30.138 [2024-07-12 18:18:13.770792] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:30.138 [2024-07-12 18:18:13.770826] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:30.138 [2024-07-12 18:18:13.770837] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1815280 name raid_bdev1, state offline 00:14:30.138 0 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2490002 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2490002 ']' 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2490002 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:30.138 18:18:13 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2490002 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2490002' 00:14:30.138 killing process with pid 2490002 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2490002 00:14:30.138 [2024-07-12 18:18:13.839110] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:30.138 18:18:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2490002 00:14:30.138 [2024-07-12 18:18:13.860718] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:30.396 18:18:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ObEQl0zSi2 00:14:30.396 18:18:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:30.396 18:18:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:30.396 18:18:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:14:30.396 18:18:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:14:30.396 18:18:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:30.396 18:18:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:30.396 18:18:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:14:30.396 00:14:30.396 real 0m7.962s 00:14:30.396 user 0m12.867s 00:14:30.396 sys 0m1.359s 00:14:30.396 18:18:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:30.396 18:18:14 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:30.396 ************************************ 00:14:30.396 END TEST raid_write_error_test 00:14:30.396 ************************************ 00:14:30.654 18:18:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:30.654 18:18:14 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:30.654 18:18:14 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:14:30.655 18:18:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:30.655 18:18:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:30.655 18:18:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:30.655 ************************************ 00:14:30.655 START TEST raid_state_function_test 00:14:30.655 ************************************ 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( 
i <= num_base_bdevs )) 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2491187 
00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2491187' 00:14:30.655 Process raid pid: 2491187 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2491187 /var/tmp/spdk-raid.sock 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2491187 ']' 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:30.655 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:30.655 18:18:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:30.655 [2024-07-12 18:18:14.263009] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:14:30.655 [2024-07-12 18:18:14.263081] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:30.913 [2024-07-12 18:18:14.396498] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:30.913 [2024-07-12 18:18:14.497178] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:30.913 [2024-07-12 18:18:14.560400] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:30.913 [2024-07-12 18:18:14.560459] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:31.477 18:18:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:31.477 18:18:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:31.477 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:31.735 [2024-07-12 18:18:15.419198] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:31.735 [2024-07-12 18:18:15.419240] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:31.735 [2024-07-12 18:18:15.419251] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:31.735 [2024-07-12 18:18:15.419268] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:31.735 [2024-07-12 18:18:15.419277] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:31.735 [2024-07-12 18:18:15.419288] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:31.735 
18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.735 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:31.993 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.993 "name": "Existed_Raid", 00:14:31.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.993 "strip_size_kb": 64, 00:14:31.993 "state": "configuring", 00:14:31.993 "raid_level": "concat", 00:14:31.993 "superblock": false, 00:14:31.993 "num_base_bdevs": 3, 00:14:31.993 "num_base_bdevs_discovered": 0, 00:14:31.993 "num_base_bdevs_operational": 3, 00:14:31.993 "base_bdevs_list": [ 00:14:31.993 { 
00:14:31.993 "name": "BaseBdev1", 00:14:31.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.993 "is_configured": false, 00:14:31.993 "data_offset": 0, 00:14:31.993 "data_size": 0 00:14:31.993 }, 00:14:31.993 { 00:14:31.993 "name": "BaseBdev2", 00:14:31.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.993 "is_configured": false, 00:14:31.993 "data_offset": 0, 00:14:31.993 "data_size": 0 00:14:31.993 }, 00:14:31.993 { 00:14:31.993 "name": "BaseBdev3", 00:14:31.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.993 "is_configured": false, 00:14:31.993 "data_offset": 0, 00:14:31.993 "data_size": 0 00:14:31.993 } 00:14:31.993 ] 00:14:31.993 }' 00:14:31.993 18:18:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.993 18:18:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.558 18:18:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:32.815 [2024-07-12 18:18:16.485904] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:32.815 [2024-07-12 18:18:16.485939] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f59a80 name Existed_Raid, state configuring 00:14:32.815 18:18:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:33.073 [2024-07-12 18:18:16.726563] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:33.073 [2024-07-12 18:18:16.726594] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:33.073 [2024-07-12 18:18:16.726604] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:14:33.073 [2024-07-12 18:18:16.726615] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:33.073 [2024-07-12 18:18:16.726624] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:33.073 [2024-07-12 18:18:16.726643] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:33.073 18:18:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:33.331 [2024-07-12 18:18:16.981132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:33.331 BaseBdev1 00:14:33.331 18:18:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:33.331 18:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:33.331 18:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:33.331 18:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:33.331 18:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:33.331 18:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:33.331 18:18:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:33.588 18:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:33.846 [ 00:14:33.846 { 00:14:33.846 "name": "BaseBdev1", 00:14:33.846 "aliases": [ 00:14:33.846 
"60694b70-a283-4e74-9f62-e538e612909e" 00:14:33.846 ], 00:14:33.846 "product_name": "Malloc disk", 00:14:33.846 "block_size": 512, 00:14:33.846 "num_blocks": 65536, 00:14:33.846 "uuid": "60694b70-a283-4e74-9f62-e538e612909e", 00:14:33.846 "assigned_rate_limits": { 00:14:33.846 "rw_ios_per_sec": 0, 00:14:33.846 "rw_mbytes_per_sec": 0, 00:14:33.846 "r_mbytes_per_sec": 0, 00:14:33.846 "w_mbytes_per_sec": 0 00:14:33.846 }, 00:14:33.846 "claimed": true, 00:14:33.846 "claim_type": "exclusive_write", 00:14:33.846 "zoned": false, 00:14:33.846 "supported_io_types": { 00:14:33.846 "read": true, 00:14:33.846 "write": true, 00:14:33.846 "unmap": true, 00:14:33.846 "flush": true, 00:14:33.846 "reset": true, 00:14:33.846 "nvme_admin": false, 00:14:33.846 "nvme_io": false, 00:14:33.846 "nvme_io_md": false, 00:14:33.846 "write_zeroes": true, 00:14:33.846 "zcopy": true, 00:14:33.846 "get_zone_info": false, 00:14:33.846 "zone_management": false, 00:14:33.846 "zone_append": false, 00:14:33.846 "compare": false, 00:14:33.846 "compare_and_write": false, 00:14:33.846 "abort": true, 00:14:33.846 "seek_hole": false, 00:14:33.846 "seek_data": false, 00:14:33.846 "copy": true, 00:14:33.846 "nvme_iov_md": false 00:14:33.846 }, 00:14:33.846 "memory_domains": [ 00:14:33.846 { 00:14:33.846 "dma_device_id": "system", 00:14:33.846 "dma_device_type": 1 00:14:33.846 }, 00:14:33.846 { 00:14:33.846 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.846 "dma_device_type": 2 00:14:33.846 } 00:14:33.846 ], 00:14:33.846 "driver_specific": {} 00:14:33.846 } 00:14:33.846 ] 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.846 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.103 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.103 "name": "Existed_Raid", 00:14:34.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.103 "strip_size_kb": 64, 00:14:34.103 "state": "configuring", 00:14:34.103 "raid_level": "concat", 00:14:34.103 "superblock": false, 00:14:34.103 "num_base_bdevs": 3, 00:14:34.103 "num_base_bdevs_discovered": 1, 00:14:34.103 "num_base_bdevs_operational": 3, 00:14:34.103 "base_bdevs_list": [ 00:14:34.103 { 00:14:34.103 "name": "BaseBdev1", 00:14:34.103 "uuid": "60694b70-a283-4e74-9f62-e538e612909e", 00:14:34.103 "is_configured": true, 00:14:34.103 "data_offset": 0, 00:14:34.103 "data_size": 65536 00:14:34.103 }, 00:14:34.103 { 00:14:34.103 "name": "BaseBdev2", 00:14:34.103 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:34.103 "is_configured": false, 00:14:34.103 "data_offset": 0, 00:14:34.103 "data_size": 0 00:14:34.103 }, 00:14:34.103 { 00:14:34.103 "name": "BaseBdev3", 00:14:34.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:34.103 "is_configured": false, 00:14:34.103 "data_offset": 0, 00:14:34.103 "data_size": 0 00:14:34.103 } 00:14:34.103 ] 00:14:34.103 }' 00:14:34.103 18:18:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.103 18:18:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:34.669 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:34.928 [2024-07-12 18:18:18.573336] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:34.928 [2024-07-12 18:18:18.573377] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f59310 name Existed_Raid, state configuring 00:14:34.928 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:35.187 [2024-07-12 18:18:18.805989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:35.187 [2024-07-12 18:18:18.807475] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:35.187 [2024-07-12 18:18:18.807510] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:35.187 [2024-07-12 18:18:18.807520] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:35.187 [2024-07-12 18:18:18.807532] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.187 18:18:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:35.445 18:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.445 "name": "Existed_Raid", 00:14:35.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.445 "strip_size_kb": 64, 00:14:35.445 "state": "configuring", 00:14:35.445 
"raid_level": "concat", 00:14:35.445 "superblock": false, 00:14:35.445 "num_base_bdevs": 3, 00:14:35.445 "num_base_bdevs_discovered": 1, 00:14:35.445 "num_base_bdevs_operational": 3, 00:14:35.445 "base_bdevs_list": [ 00:14:35.445 { 00:14:35.445 "name": "BaseBdev1", 00:14:35.445 "uuid": "60694b70-a283-4e74-9f62-e538e612909e", 00:14:35.445 "is_configured": true, 00:14:35.445 "data_offset": 0, 00:14:35.445 "data_size": 65536 00:14:35.445 }, 00:14:35.445 { 00:14:35.445 "name": "BaseBdev2", 00:14:35.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.445 "is_configured": false, 00:14:35.445 "data_offset": 0, 00:14:35.445 "data_size": 0 00:14:35.445 }, 00:14:35.445 { 00:14:35.445 "name": "BaseBdev3", 00:14:35.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:35.445 "is_configured": false, 00:14:35.445 "data_offset": 0, 00:14:35.445 "data_size": 0 00:14:35.445 } 00:14:35.445 ] 00:14:35.445 }' 00:14:35.445 18:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.445 18:18:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:36.010 18:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:36.268 [2024-07-12 18:18:19.884459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:36.268 BaseBdev2 00:14:36.268 18:18:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:36.268 18:18:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:36.268 18:18:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:36.268 18:18:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:36.268 18:18:19 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:36.268 18:18:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:36.269 18:18:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:36.545 18:18:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:36.804 [ 00:14:36.804 { 00:14:36.804 "name": "BaseBdev2", 00:14:36.804 "aliases": [ 00:14:36.804 "faa6b665-df1f-4542-ab10-801ec1864d23" 00:14:36.804 ], 00:14:36.804 "product_name": "Malloc disk", 00:14:36.804 "block_size": 512, 00:14:36.804 "num_blocks": 65536, 00:14:36.804 "uuid": "faa6b665-df1f-4542-ab10-801ec1864d23", 00:14:36.804 "assigned_rate_limits": { 00:14:36.804 "rw_ios_per_sec": 0, 00:14:36.804 "rw_mbytes_per_sec": 0, 00:14:36.804 "r_mbytes_per_sec": 0, 00:14:36.804 "w_mbytes_per_sec": 0 00:14:36.804 }, 00:14:36.804 "claimed": true, 00:14:36.804 "claim_type": "exclusive_write", 00:14:36.804 "zoned": false, 00:14:36.804 "supported_io_types": { 00:14:36.804 "read": true, 00:14:36.804 "write": true, 00:14:36.804 "unmap": true, 00:14:36.804 "flush": true, 00:14:36.804 "reset": true, 00:14:36.804 "nvme_admin": false, 00:14:36.804 "nvme_io": false, 00:14:36.804 "nvme_io_md": false, 00:14:36.804 "write_zeroes": true, 00:14:36.804 "zcopy": true, 00:14:36.804 "get_zone_info": false, 00:14:36.804 "zone_management": false, 00:14:36.804 "zone_append": false, 00:14:36.804 "compare": false, 00:14:36.804 "compare_and_write": false, 00:14:36.804 "abort": true, 00:14:36.804 "seek_hole": false, 00:14:36.804 "seek_data": false, 00:14:36.804 "copy": true, 00:14:36.804 "nvme_iov_md": false 00:14:36.804 }, 00:14:36.804 "memory_domains": [ 00:14:36.804 { 00:14:36.804 "dma_device_id": "system", 
00:14:36.804 "dma_device_type": 1 00:14:36.804 }, 00:14:36.804 { 00:14:36.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.804 "dma_device_type": 2 00:14:36.804 } 00:14:36.804 ], 00:14:36.804 "driver_specific": {} 00:14:36.804 } 00:14:36.804 ] 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.804 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.062 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.062 "name": "Existed_Raid", 00:14:37.062 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.062 "strip_size_kb": 64, 00:14:37.062 "state": "configuring", 00:14:37.062 "raid_level": "concat", 00:14:37.062 "superblock": false, 00:14:37.062 "num_base_bdevs": 3, 00:14:37.062 "num_base_bdevs_discovered": 2, 00:14:37.062 "num_base_bdevs_operational": 3, 00:14:37.062 "base_bdevs_list": [ 00:14:37.062 { 00:14:37.062 "name": "BaseBdev1", 00:14:37.062 "uuid": "60694b70-a283-4e74-9f62-e538e612909e", 00:14:37.062 "is_configured": true, 00:14:37.062 "data_offset": 0, 00:14:37.062 "data_size": 65536 00:14:37.062 }, 00:14:37.062 { 00:14:37.062 "name": "BaseBdev2", 00:14:37.062 "uuid": "faa6b665-df1f-4542-ab10-801ec1864d23", 00:14:37.062 "is_configured": true, 00:14:37.062 "data_offset": 0, 00:14:37.062 "data_size": 65536 00:14:37.062 }, 00:14:37.062 { 00:14:37.062 "name": "BaseBdev3", 00:14:37.062 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.062 "is_configured": false, 00:14:37.062 "data_offset": 0, 00:14:37.062 "data_size": 0 00:14:37.062 } 00:14:37.062 ] 00:14:37.062 }' 00:14:37.062 18:18:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.062 18:18:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.628 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:37.887 [2024-07-12 18:18:21.445560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:37.887 [2024-07-12 18:18:21.445601] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f5a400 00:14:37.887 
[2024-07-12 18:18:21.445610] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:37.887 [2024-07-12 18:18:21.445862] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f59ef0 00:14:37.887 [2024-07-12 18:18:21.445999] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f5a400 00:14:37.887 [2024-07-12 18:18:21.446010] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f5a400 00:14:37.887 [2024-07-12 18:18:21.446181] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:37.887 BaseBdev3 00:14:37.887 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:37.887 18:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:37.887 18:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:37.887 18:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:37.887 18:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:37.887 18:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:37.887 18:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:38.146 [ 00:14:38.146 { 00:14:38.146 "name": "BaseBdev3", 00:14:38.146 "aliases": [ 00:14:38.146 "88184737-fb11-4dac-9267-ffe24127f885" 00:14:38.146 ], 00:14:38.146 "product_name": "Malloc disk", 00:14:38.146 "block_size": 512, 00:14:38.146 
"num_blocks": 65536, 00:14:38.146 "uuid": "88184737-fb11-4dac-9267-ffe24127f885", 00:14:38.146 "assigned_rate_limits": { 00:14:38.146 "rw_ios_per_sec": 0, 00:14:38.146 "rw_mbytes_per_sec": 0, 00:14:38.146 "r_mbytes_per_sec": 0, 00:14:38.146 "w_mbytes_per_sec": 0 00:14:38.146 }, 00:14:38.146 "claimed": true, 00:14:38.146 "claim_type": "exclusive_write", 00:14:38.146 "zoned": false, 00:14:38.146 "supported_io_types": { 00:14:38.146 "read": true, 00:14:38.146 "write": true, 00:14:38.146 "unmap": true, 00:14:38.146 "flush": true, 00:14:38.146 "reset": true, 00:14:38.146 "nvme_admin": false, 00:14:38.146 "nvme_io": false, 00:14:38.146 "nvme_io_md": false, 00:14:38.146 "write_zeroes": true, 00:14:38.146 "zcopy": true, 00:14:38.146 "get_zone_info": false, 00:14:38.146 "zone_management": false, 00:14:38.146 "zone_append": false, 00:14:38.146 "compare": false, 00:14:38.146 "compare_and_write": false, 00:14:38.146 "abort": true, 00:14:38.146 "seek_hole": false, 00:14:38.146 "seek_data": false, 00:14:38.146 "copy": true, 00:14:38.146 "nvme_iov_md": false 00:14:38.146 }, 00:14:38.146 "memory_domains": [ 00:14:38.146 { 00:14:38.146 "dma_device_id": "system", 00:14:38.146 "dma_device_type": 1 00:14:38.146 }, 00:14:38.146 { 00:14:38.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.146 "dma_device_type": 2 00:14:38.146 } 00:14:38.146 ], 00:14:38.146 "driver_specific": {} 00:14:38.146 } 00:14:38.146 ] 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
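The `verify_raid_bdev_state` helper seen throughout this log (bdev_raid.sh@116–128) fetches the raid's JSON via `bdev_raid_get_bdevs` and compares fields such as `state` and `raid_level` against expectations. A minimal stand-alone sketch of that check follows; it uses a JSON snippet mirroring the dumps above instead of a live RPC call, and the `sed` extraction is a deliberate simplification of the `jq` filters the real script uses:

```shell
#!/bin/sh
# Stand-in for the output of:
#   rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
# Values mirror the raid_bdev_info dumps in the log; no target needed.
raid_bdev_info='{
  "name": "Existed_Raid",
  "state": "online",
  "raid_level": "concat",
  "strip_size_kb": 64,
  "num_base_bdevs": 3,
  "num_base_bdevs_discovered": 3
}'

# Crude per-line field extraction (the real script uses jq); adequate
# for flat, one-field-per-line JSON like the RPC output above.
get_field() {
    printf '%s\n' "$raid_bdev_info" |
        sed -n "s/.*\"$1\": *\"\{0,1\}\([^\",}]*\)\"\{0,1\}.*/\1/p" |
        head -n1
}

state=$(get_field state)
raid_level=$(get_field raid_level)

# The test fails unless the observed state matches the expected one:
[ "$state" = online ] && echo "state ok: $state"
[ "$raid_level" = concat ] && echo "level ok: $raid_level"
```

In the real helper the expected state varies per call site ("configuring" while base bdevs are missing, "online" once all are claimed, "offline" after a base bdev is removed), exactly as the successive dumps in this log show.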
00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.146 18:18:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:38.714 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:38.714 "name": "Existed_Raid", 00:14:38.714 "uuid": "98511660-d6ed-4d96-95e9-5d557d7bcc67", 00:14:38.714 "strip_size_kb": 64, 00:14:38.714 "state": "online", 00:14:38.714 "raid_level": "concat", 00:14:38.714 "superblock": false, 00:14:38.714 "num_base_bdevs": 3, 00:14:38.714 "num_base_bdevs_discovered": 3, 00:14:38.714 "num_base_bdevs_operational": 3, 00:14:38.714 "base_bdevs_list": [ 00:14:38.714 { 00:14:38.714 "name": "BaseBdev1", 00:14:38.714 "uuid": "60694b70-a283-4e74-9f62-e538e612909e", 00:14:38.714 "is_configured": true, 00:14:38.714 "data_offset": 0, 00:14:38.714 "data_size": 65536 00:14:38.714 }, 00:14:38.714 { 00:14:38.714 "name": "BaseBdev2", 
00:14:38.714 "uuid": "faa6b665-df1f-4542-ab10-801ec1864d23", 00:14:38.714 "is_configured": true, 00:14:38.714 "data_offset": 0, 00:14:38.714 "data_size": 65536 00:14:38.714 }, 00:14:38.714 { 00:14:38.714 "name": "BaseBdev3", 00:14:38.714 "uuid": "88184737-fb11-4dac-9267-ffe24127f885", 00:14:38.714 "is_configured": true, 00:14:38.714 "data_offset": 0, 00:14:38.714 "data_size": 65536 00:14:38.714 } 00:14:38.714 ] 00:14:38.714 }' 00:14:38.714 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:38.714 18:18:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:39.282 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:39.282 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:39.282 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:39.282 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:39.282 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:39.282 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:39.282 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:39.282 18:18:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:39.849 [2024-07-12 18:18:23.471238] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:39.849 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:39.849 "name": "Existed_Raid", 00:14:39.849 "aliases": [ 00:14:39.849 "98511660-d6ed-4d96-95e9-5d557d7bcc67" 00:14:39.849 ], 00:14:39.849 "product_name": 
"Raid Volume", 00:14:39.849 "block_size": 512, 00:14:39.849 "num_blocks": 196608, 00:14:39.849 "uuid": "98511660-d6ed-4d96-95e9-5d557d7bcc67", 00:14:39.849 "assigned_rate_limits": { 00:14:39.849 "rw_ios_per_sec": 0, 00:14:39.849 "rw_mbytes_per_sec": 0, 00:14:39.849 "r_mbytes_per_sec": 0, 00:14:39.849 "w_mbytes_per_sec": 0 00:14:39.849 }, 00:14:39.849 "claimed": false, 00:14:39.849 "zoned": false, 00:14:39.849 "supported_io_types": { 00:14:39.849 "read": true, 00:14:39.849 "write": true, 00:14:39.849 "unmap": true, 00:14:39.849 "flush": true, 00:14:39.849 "reset": true, 00:14:39.849 "nvme_admin": false, 00:14:39.849 "nvme_io": false, 00:14:39.849 "nvme_io_md": false, 00:14:39.849 "write_zeroes": true, 00:14:39.849 "zcopy": false, 00:14:39.849 "get_zone_info": false, 00:14:39.849 "zone_management": false, 00:14:39.849 "zone_append": false, 00:14:39.849 "compare": false, 00:14:39.849 "compare_and_write": false, 00:14:39.849 "abort": false, 00:14:39.849 "seek_hole": false, 00:14:39.849 "seek_data": false, 00:14:39.849 "copy": false, 00:14:39.849 "nvme_iov_md": false 00:14:39.849 }, 00:14:39.849 "memory_domains": [ 00:14:39.849 { 00:14:39.849 "dma_device_id": "system", 00:14:39.849 "dma_device_type": 1 00:14:39.849 }, 00:14:39.849 { 00:14:39.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.849 "dma_device_type": 2 00:14:39.849 }, 00:14:39.849 { 00:14:39.849 "dma_device_id": "system", 00:14:39.849 "dma_device_type": 1 00:14:39.849 }, 00:14:39.849 { 00:14:39.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.849 "dma_device_type": 2 00:14:39.849 }, 00:14:39.849 { 00:14:39.849 "dma_device_id": "system", 00:14:39.849 "dma_device_type": 1 00:14:39.849 }, 00:14:39.849 { 00:14:39.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.849 "dma_device_type": 2 00:14:39.849 } 00:14:39.849 ], 00:14:39.849 "driver_specific": { 00:14:39.849 "raid": { 00:14:39.849 "uuid": "98511660-d6ed-4d96-95e9-5d557d7bcc67", 00:14:39.849 "strip_size_kb": 64, 00:14:39.849 "state": 
"online", 00:14:39.849 "raid_level": "concat", 00:14:39.849 "superblock": false, 00:14:39.849 "num_base_bdevs": 3, 00:14:39.849 "num_base_bdevs_discovered": 3, 00:14:39.849 "num_base_bdevs_operational": 3, 00:14:39.849 "base_bdevs_list": [ 00:14:39.849 { 00:14:39.849 "name": "BaseBdev1", 00:14:39.849 "uuid": "60694b70-a283-4e74-9f62-e538e612909e", 00:14:39.849 "is_configured": true, 00:14:39.849 "data_offset": 0, 00:14:39.849 "data_size": 65536 00:14:39.849 }, 00:14:39.849 { 00:14:39.849 "name": "BaseBdev2", 00:14:39.849 "uuid": "faa6b665-df1f-4542-ab10-801ec1864d23", 00:14:39.849 "is_configured": true, 00:14:39.849 "data_offset": 0, 00:14:39.849 "data_size": 65536 00:14:39.849 }, 00:14:39.849 { 00:14:39.849 "name": "BaseBdev3", 00:14:39.849 "uuid": "88184737-fb11-4dac-9267-ffe24127f885", 00:14:39.849 "is_configured": true, 00:14:39.849 "data_offset": 0, 00:14:39.849 "data_size": 65536 00:14:39.849 } 00:14:39.849 ] 00:14:39.849 } 00:14:39.849 } 00:14:39.849 }' 00:14:39.849 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:39.849 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:39.849 BaseBdev2 00:14:39.849 BaseBdev3' 00:14:39.849 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:39.849 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:39.849 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:40.108 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:40.108 "name": "BaseBdev1", 00:14:40.108 "aliases": [ 00:14:40.108 "60694b70-a283-4e74-9f62-e538e612909e" 00:14:40.108 ], 00:14:40.108 "product_name": "Malloc 
disk", 00:14:40.108 "block_size": 512, 00:14:40.108 "num_blocks": 65536, 00:14:40.108 "uuid": "60694b70-a283-4e74-9f62-e538e612909e", 00:14:40.108 "assigned_rate_limits": { 00:14:40.108 "rw_ios_per_sec": 0, 00:14:40.108 "rw_mbytes_per_sec": 0, 00:14:40.108 "r_mbytes_per_sec": 0, 00:14:40.108 "w_mbytes_per_sec": 0 00:14:40.108 }, 00:14:40.108 "claimed": true, 00:14:40.108 "claim_type": "exclusive_write", 00:14:40.108 "zoned": false, 00:14:40.108 "supported_io_types": { 00:14:40.108 "read": true, 00:14:40.108 "write": true, 00:14:40.108 "unmap": true, 00:14:40.108 "flush": true, 00:14:40.108 "reset": true, 00:14:40.108 "nvme_admin": false, 00:14:40.108 "nvme_io": false, 00:14:40.108 "nvme_io_md": false, 00:14:40.108 "write_zeroes": true, 00:14:40.108 "zcopy": true, 00:14:40.108 "get_zone_info": false, 00:14:40.108 "zone_management": false, 00:14:40.108 "zone_append": false, 00:14:40.108 "compare": false, 00:14:40.108 "compare_and_write": false, 00:14:40.108 "abort": true, 00:14:40.108 "seek_hole": false, 00:14:40.108 "seek_data": false, 00:14:40.108 "copy": true, 00:14:40.108 "nvme_iov_md": false 00:14:40.108 }, 00:14:40.108 "memory_domains": [ 00:14:40.108 { 00:14:40.108 "dma_device_id": "system", 00:14:40.108 "dma_device_type": 1 00:14:40.108 }, 00:14:40.108 { 00:14:40.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.108 "dma_device_type": 2 00:14:40.108 } 00:14:40.108 ], 00:14:40.108 "driver_specific": {} 00:14:40.108 }' 00:14:40.108 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.367 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.367 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:40.367 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.367 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.367 18:18:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:40.367 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.367 18:18:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.367 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:40.367 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.367 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.626 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:40.626 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.626 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:40.626 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:40.626 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:40.626 "name": "BaseBdev2", 00:14:40.626 "aliases": [ 00:14:40.626 "faa6b665-df1f-4542-ab10-801ec1864d23" 00:14:40.626 ], 00:14:40.626 "product_name": "Malloc disk", 00:14:40.626 "block_size": 512, 00:14:40.626 "num_blocks": 65536, 00:14:40.626 "uuid": "faa6b665-df1f-4542-ab10-801ec1864d23", 00:14:40.626 "assigned_rate_limits": { 00:14:40.626 "rw_ios_per_sec": 0, 00:14:40.626 "rw_mbytes_per_sec": 0, 00:14:40.626 "r_mbytes_per_sec": 0, 00:14:40.626 "w_mbytes_per_sec": 0 00:14:40.626 }, 00:14:40.626 "claimed": true, 00:14:40.626 "claim_type": "exclusive_write", 00:14:40.626 "zoned": false, 00:14:40.626 "supported_io_types": { 00:14:40.626 "read": true, 00:14:40.626 "write": true, 00:14:40.626 "unmap": true, 00:14:40.626 "flush": true, 00:14:40.626 "reset": 
true, 00:14:40.626 "nvme_admin": false, 00:14:40.626 "nvme_io": false, 00:14:40.626 "nvme_io_md": false, 00:14:40.626 "write_zeroes": true, 00:14:40.626 "zcopy": true, 00:14:40.626 "get_zone_info": false, 00:14:40.626 "zone_management": false, 00:14:40.626 "zone_append": false, 00:14:40.626 "compare": false, 00:14:40.626 "compare_and_write": false, 00:14:40.626 "abort": true, 00:14:40.626 "seek_hole": false, 00:14:40.626 "seek_data": false, 00:14:40.626 "copy": true, 00:14:40.626 "nvme_iov_md": false 00:14:40.626 }, 00:14:40.626 "memory_domains": [ 00:14:40.626 { 00:14:40.626 "dma_device_id": "system", 00:14:40.626 "dma_device_type": 1 00:14:40.626 }, 00:14:40.626 { 00:14:40.626 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.626 "dma_device_type": 2 00:14:40.626 } 00:14:40.626 ], 00:14:40.626 "driver_specific": {} 00:14:40.626 }' 00:14:40.626 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.626 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:40.626 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:40.626 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.884 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:40.884 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:40.885 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.885 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:40.885 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:40.885 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.885 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:40.885 18:18:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:40.885 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:40.885 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:40.885 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:41.143 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:41.143 "name": "BaseBdev3", 00:14:41.143 "aliases": [ 00:14:41.143 "88184737-fb11-4dac-9267-ffe24127f885" 00:14:41.143 ], 00:14:41.143 "product_name": "Malloc disk", 00:14:41.143 "block_size": 512, 00:14:41.143 "num_blocks": 65536, 00:14:41.143 "uuid": "88184737-fb11-4dac-9267-ffe24127f885", 00:14:41.143 "assigned_rate_limits": { 00:14:41.143 "rw_ios_per_sec": 0, 00:14:41.143 "rw_mbytes_per_sec": 0, 00:14:41.143 "r_mbytes_per_sec": 0, 00:14:41.143 "w_mbytes_per_sec": 0 00:14:41.143 }, 00:14:41.143 "claimed": true, 00:14:41.143 "claim_type": "exclusive_write", 00:14:41.143 "zoned": false, 00:14:41.143 "supported_io_types": { 00:14:41.143 "read": true, 00:14:41.143 "write": true, 00:14:41.143 "unmap": true, 00:14:41.143 "flush": true, 00:14:41.143 "reset": true, 00:14:41.143 "nvme_admin": false, 00:14:41.143 "nvme_io": false, 00:14:41.143 "nvme_io_md": false, 00:14:41.143 "write_zeroes": true, 00:14:41.143 "zcopy": true, 00:14:41.143 "get_zone_info": false, 00:14:41.143 "zone_management": false, 00:14:41.143 "zone_append": false, 00:14:41.143 "compare": false, 00:14:41.143 "compare_and_write": false, 00:14:41.143 "abort": true, 00:14:41.143 "seek_hole": false, 00:14:41.143 "seek_data": false, 00:14:41.143 "copy": true, 00:14:41.143 "nvme_iov_md": false 00:14:41.143 }, 00:14:41.143 "memory_domains": [ 00:14:41.143 { 00:14:41.143 "dma_device_id": "system", 00:14:41.143 
"dma_device_type": 1 00:14:41.143 }, 00:14:41.143 { 00:14:41.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.143 "dma_device_type": 2 00:14:41.143 } 00:14:41.143 ], 00:14:41.143 "driver_specific": {} 00:14:41.143 }' 00:14:41.143 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.419 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:41.419 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:41.419 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.419 18:18:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:41.419 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:41.419 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.419 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:41.419 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:41.419 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.678 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:41.678 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:41.678 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:41.936 [2024-07-12 18:18:25.428178] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:41.936 [2024-07-12 18:18:25.428202] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:41.936 [2024-07-12 18:18:25.428243] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:41.936 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.937 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.937 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.937 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.937 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.937 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:14:42.195 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.195 "name": "Existed_Raid", 00:14:42.195 "uuid": "98511660-d6ed-4d96-95e9-5d557d7bcc67", 00:14:42.195 "strip_size_kb": 64, 00:14:42.195 "state": "offline", 00:14:42.195 "raid_level": "concat", 00:14:42.195 "superblock": false, 00:14:42.195 "num_base_bdevs": 3, 00:14:42.195 "num_base_bdevs_discovered": 2, 00:14:42.195 "num_base_bdevs_operational": 2, 00:14:42.195 "base_bdevs_list": [ 00:14:42.195 { 00:14:42.195 "name": null, 00:14:42.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.195 "is_configured": false, 00:14:42.195 "data_offset": 0, 00:14:42.195 "data_size": 65536 00:14:42.195 }, 00:14:42.195 { 00:14:42.195 "name": "BaseBdev2", 00:14:42.195 "uuid": "faa6b665-df1f-4542-ab10-801ec1864d23", 00:14:42.195 "is_configured": true, 00:14:42.195 "data_offset": 0, 00:14:42.195 "data_size": 65536 00:14:42.195 }, 00:14:42.195 { 00:14:42.195 "name": "BaseBdev3", 00:14:42.195 "uuid": "88184737-fb11-4dac-9267-ffe24127f885", 00:14:42.195 "is_configured": true, 00:14:42.195 "data_offset": 0, 00:14:42.195 "data_size": 65536 00:14:42.195 } 00:14:42.195 ] 00:14:42.195 }' 00:14:42.195 18:18:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.195 18:18:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.762 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:42.762 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:42.762 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.762 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:42.762 18:18:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:42.762 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:42.762 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:43.020 [2024-07-12 18:18:26.640833] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:43.020 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:43.020 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.020 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.020 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:43.278 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:43.278 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:43.278 18:18:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:43.537 [2024-07-12 18:18:27.081428] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:43.537 [2024-07-12 18:18:27.081469] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f5a400 name Existed_Raid, state offline 00:14:43.537 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:43.537 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:43.537 18:18:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.537 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:43.795 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:43.795 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:43.795 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:43.795 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:43.795 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:43.795 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:44.053 BaseBdev2 00:14:44.053 18:18:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:44.053 18:18:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:44.053 18:18:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:44.053 18:18:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:44.053 18:18:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:44.053 18:18:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:44.053 18:18:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:44.310 18:18:27 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:44.568 [ 00:14:44.568 { 00:14:44.568 "name": "BaseBdev2", 00:14:44.568 "aliases": [ 00:14:44.568 "e9e9b4b1-316d-4484-b071-4fd20f2573a5" 00:14:44.568 ], 00:14:44.568 "product_name": "Malloc disk", 00:14:44.568 "block_size": 512, 00:14:44.568 "num_blocks": 65536, 00:14:44.568 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:44.568 "assigned_rate_limits": { 00:14:44.568 "rw_ios_per_sec": 0, 00:14:44.568 "rw_mbytes_per_sec": 0, 00:14:44.568 "r_mbytes_per_sec": 0, 00:14:44.568 "w_mbytes_per_sec": 0 00:14:44.568 }, 00:14:44.568 "claimed": false, 00:14:44.568 "zoned": false, 00:14:44.568 "supported_io_types": { 00:14:44.568 "read": true, 00:14:44.568 "write": true, 00:14:44.568 "unmap": true, 00:14:44.568 "flush": true, 00:14:44.568 "reset": true, 00:14:44.568 "nvme_admin": false, 00:14:44.568 "nvme_io": false, 00:14:44.568 "nvme_io_md": false, 00:14:44.568 "write_zeroes": true, 00:14:44.568 "zcopy": true, 00:14:44.568 "get_zone_info": false, 00:14:44.568 "zone_management": false, 00:14:44.568 "zone_append": false, 00:14:44.568 "compare": false, 00:14:44.568 "compare_and_write": false, 00:14:44.568 "abort": true, 00:14:44.568 "seek_hole": false, 00:14:44.568 "seek_data": false, 00:14:44.568 "copy": true, 00:14:44.568 "nvme_iov_md": false 00:14:44.568 }, 00:14:44.568 "memory_domains": [ 00:14:44.568 { 00:14:44.568 "dma_device_id": "system", 00:14:44.568 "dma_device_type": 1 00:14:44.568 }, 00:14:44.568 { 00:14:44.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.568 "dma_device_type": 2 00:14:44.568 } 00:14:44.568 ], 00:14:44.568 "driver_specific": {} 00:14:44.568 } 00:14:44.568 ] 00:14:44.568 18:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:44.568 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:44.568 18:18:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:44.568 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:44.825 BaseBdev3 00:14:44.825 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:44.825 18:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:44.825 18:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:44.825 18:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:44.825 18:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:44.825 18:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:44.825 18:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:44.825 18:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:45.082 [ 00:14:45.082 { 00:14:45.082 "name": "BaseBdev3", 00:14:45.082 "aliases": [ 00:14:45.082 "f309239d-ca90-4508-9adf-09e47699e432" 00:14:45.082 ], 00:14:45.082 "product_name": "Malloc disk", 00:14:45.082 "block_size": 512, 00:14:45.082 "num_blocks": 65536, 00:14:45.083 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:45.083 "assigned_rate_limits": { 00:14:45.083 "rw_ios_per_sec": 0, 00:14:45.083 "rw_mbytes_per_sec": 0, 00:14:45.083 "r_mbytes_per_sec": 0, 00:14:45.083 "w_mbytes_per_sec": 0 00:14:45.083 }, 00:14:45.083 "claimed": false, 00:14:45.083 
"zoned": false, 00:14:45.083 "supported_io_types": { 00:14:45.083 "read": true, 00:14:45.083 "write": true, 00:14:45.083 "unmap": true, 00:14:45.083 "flush": true, 00:14:45.083 "reset": true, 00:14:45.083 "nvme_admin": false, 00:14:45.083 "nvme_io": false, 00:14:45.083 "nvme_io_md": false, 00:14:45.083 "write_zeroes": true, 00:14:45.083 "zcopy": true, 00:14:45.083 "get_zone_info": false, 00:14:45.083 "zone_management": false, 00:14:45.083 "zone_append": false, 00:14:45.083 "compare": false, 00:14:45.083 "compare_and_write": false, 00:14:45.083 "abort": true, 00:14:45.083 "seek_hole": false, 00:14:45.083 "seek_data": false, 00:14:45.083 "copy": true, 00:14:45.083 "nvme_iov_md": false 00:14:45.083 }, 00:14:45.083 "memory_domains": [ 00:14:45.083 { 00:14:45.083 "dma_device_id": "system", 00:14:45.083 "dma_device_type": 1 00:14:45.083 }, 00:14:45.083 { 00:14:45.083 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.083 "dma_device_type": 2 00:14:45.083 } 00:14:45.083 ], 00:14:45.083 "driver_specific": {} 00:14:45.083 } 00:14:45.083 ] 00:14:45.083 18:18:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:45.083 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:45.083 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:45.083 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:45.341 [2024-07-12 18:18:28.925867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:45.341 [2024-07-12 18:18:28.925912] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:45.341 [2024-07-12 18:18:28.925939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is 
claimed 00:14:45.341 [2024-07-12 18:18:28.927512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:45.341 18:18:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.599 18:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:45.599 "name": "Existed_Raid", 00:14:45.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.599 "strip_size_kb": 64, 00:14:45.599 "state": "configuring", 00:14:45.599 "raid_level": "concat", 00:14:45.599 "superblock": false, 00:14:45.599 
"num_base_bdevs": 3, 00:14:45.599 "num_base_bdevs_discovered": 2, 00:14:45.599 "num_base_bdevs_operational": 3, 00:14:45.599 "base_bdevs_list": [ 00:14:45.599 { 00:14:45.599 "name": "BaseBdev1", 00:14:45.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.599 "is_configured": false, 00:14:45.599 "data_offset": 0, 00:14:45.599 "data_size": 0 00:14:45.599 }, 00:14:45.599 { 00:14:45.599 "name": "BaseBdev2", 00:14:45.599 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:45.599 "is_configured": true, 00:14:45.599 "data_offset": 0, 00:14:45.599 "data_size": 65536 00:14:45.599 }, 00:14:45.599 { 00:14:45.599 "name": "BaseBdev3", 00:14:45.599 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:45.599 "is_configured": true, 00:14:45.599 "data_offset": 0, 00:14:45.599 "data_size": 65536 00:14:45.599 } 00:14:45.599 ] 00:14:45.599 }' 00:14:45.599 18:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:45.599 18:18:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:46.166 18:18:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:46.731 [2024-07-12 18:18:30.249379] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.732 18:18:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.732 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.990 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.990 "name": "Existed_Raid", 00:14:46.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.990 "strip_size_kb": 64, 00:14:46.990 "state": "configuring", 00:14:46.990 "raid_level": "concat", 00:14:46.990 "superblock": false, 00:14:46.990 "num_base_bdevs": 3, 00:14:46.990 "num_base_bdevs_discovered": 1, 00:14:46.990 "num_base_bdevs_operational": 3, 00:14:46.990 "base_bdevs_list": [ 00:14:46.990 { 00:14:46.990 "name": "BaseBdev1", 00:14:46.990 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:46.990 "is_configured": false, 00:14:46.990 "data_offset": 0, 00:14:46.990 "data_size": 0 00:14:46.990 }, 00:14:46.990 { 00:14:46.990 "name": null, 00:14:46.990 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:46.990 "is_configured": false, 00:14:46.990 "data_offset": 0, 00:14:46.990 "data_size": 65536 00:14:46.990 }, 00:14:46.990 { 00:14:46.990 "name": "BaseBdev3", 00:14:46.990 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:46.990 "is_configured": true, 00:14:46.990 "data_offset": 0, 
00:14:46.990 "data_size": 65536 00:14:46.990 } 00:14:46.990 ] 00:14:46.990 }' 00:14:46.990 18:18:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.990 18:18:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.557 18:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.557 18:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:47.814 18:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:47.814 18:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:48.072 [2024-07-12 18:18:31.654997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:48.072 BaseBdev1 00:14:48.072 18:18:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:48.072 18:18:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:48.072 18:18:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:48.072 18:18:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:48.072 18:18:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:48.072 18:18:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:48.072 18:18:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:48.329 18:18:31 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:48.588 [ 00:14:48.588 { 00:14:48.588 "name": "BaseBdev1", 00:14:48.588 "aliases": [ 00:14:48.588 "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6" 00:14:48.588 ], 00:14:48.588 "product_name": "Malloc disk", 00:14:48.588 "block_size": 512, 00:14:48.588 "num_blocks": 65536, 00:14:48.588 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:48.588 "assigned_rate_limits": { 00:14:48.588 "rw_ios_per_sec": 0, 00:14:48.588 "rw_mbytes_per_sec": 0, 00:14:48.588 "r_mbytes_per_sec": 0, 00:14:48.588 "w_mbytes_per_sec": 0 00:14:48.588 }, 00:14:48.588 "claimed": true, 00:14:48.588 "claim_type": "exclusive_write", 00:14:48.588 "zoned": false, 00:14:48.588 "supported_io_types": { 00:14:48.588 "read": true, 00:14:48.588 "write": true, 00:14:48.588 "unmap": true, 00:14:48.588 "flush": true, 00:14:48.588 "reset": true, 00:14:48.588 "nvme_admin": false, 00:14:48.588 "nvme_io": false, 00:14:48.588 "nvme_io_md": false, 00:14:48.588 "write_zeroes": true, 00:14:48.588 "zcopy": true, 00:14:48.588 "get_zone_info": false, 00:14:48.588 "zone_management": false, 00:14:48.588 "zone_append": false, 00:14:48.588 "compare": false, 00:14:48.588 "compare_and_write": false, 00:14:48.588 "abort": true, 00:14:48.588 "seek_hole": false, 00:14:48.588 "seek_data": false, 00:14:48.588 "copy": true, 00:14:48.588 "nvme_iov_md": false 00:14:48.588 }, 00:14:48.588 "memory_domains": [ 00:14:48.588 { 00:14:48.588 "dma_device_id": "system", 00:14:48.588 "dma_device_type": 1 00:14:48.588 }, 00:14:48.588 { 00:14:48.588 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.588 "dma_device_type": 2 00:14:48.588 } 00:14:48.588 ], 00:14:48.588 "driver_specific": {} 00:14:48.588 } 00:14:48.588 ] 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:48.588 18:18:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.588 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:48.847 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:48.847 "name": "Existed_Raid", 00:14:48.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:48.847 "strip_size_kb": 64, 00:14:48.847 "state": "configuring", 00:14:48.847 "raid_level": "concat", 00:14:48.847 "superblock": false, 00:14:48.847 "num_base_bdevs": 3, 00:14:48.847 "num_base_bdevs_discovered": 2, 00:14:48.847 "num_base_bdevs_operational": 3, 00:14:48.847 "base_bdevs_list": [ 00:14:48.847 { 
00:14:48.847 "name": "BaseBdev1", 00:14:48.847 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:48.847 "is_configured": true, 00:14:48.847 "data_offset": 0, 00:14:48.847 "data_size": 65536 00:14:48.847 }, 00:14:48.847 { 00:14:48.847 "name": null, 00:14:48.847 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:48.847 "is_configured": false, 00:14:48.847 "data_offset": 0, 00:14:48.847 "data_size": 65536 00:14:48.847 }, 00:14:48.847 { 00:14:48.847 "name": "BaseBdev3", 00:14:48.847 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:48.847 "is_configured": true, 00:14:48.847 "data_offset": 0, 00:14:48.847 "data_size": 65536 00:14:48.847 } 00:14:48.847 ] 00:14:48.847 }' 00:14:48.847 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:48.847 18:18:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:49.413 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:49.413 18:18:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.671 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:49.671 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:49.928 [2024-07-12 18:18:33.431725] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.928 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:50.185 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:50.185 "name": "Existed_Raid", 00:14:50.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:50.185 "strip_size_kb": 64, 00:14:50.185 "state": "configuring", 00:14:50.185 "raid_level": "concat", 00:14:50.185 "superblock": false, 00:14:50.185 "num_base_bdevs": 3, 00:14:50.185 "num_base_bdevs_discovered": 1, 00:14:50.185 "num_base_bdevs_operational": 3, 00:14:50.185 "base_bdevs_list": [ 00:14:50.185 { 00:14:50.185 "name": "BaseBdev1", 00:14:50.185 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:50.185 "is_configured": true, 00:14:50.185 "data_offset": 0, 00:14:50.185 "data_size": 65536 00:14:50.185 }, 00:14:50.185 { 00:14:50.185 "name": null, 00:14:50.185 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:50.185 
"is_configured": false, 00:14:50.185 "data_offset": 0, 00:14:50.185 "data_size": 65536 00:14:50.185 }, 00:14:50.185 { 00:14:50.185 "name": null, 00:14:50.185 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:50.185 "is_configured": false, 00:14:50.185 "data_offset": 0, 00:14:50.185 "data_size": 65536 00:14:50.185 } 00:14:50.185 ] 00:14:50.185 }' 00:14:50.185 18:18:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:50.185 18:18:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.750 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.750 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:51.007 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:51.008 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:51.265 [2024-07-12 18:18:34.779319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.266 18:18:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.266 18:18:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.524 18:18:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.524 "name": "Existed_Raid", 00:14:51.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.524 "strip_size_kb": 64, 00:14:51.524 "state": "configuring", 00:14:51.524 "raid_level": "concat", 00:14:51.524 "superblock": false, 00:14:51.524 "num_base_bdevs": 3, 00:14:51.524 "num_base_bdevs_discovered": 2, 00:14:51.524 "num_base_bdevs_operational": 3, 00:14:51.524 "base_bdevs_list": [ 00:14:51.524 { 00:14:51.524 "name": "BaseBdev1", 00:14:51.524 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:51.524 "is_configured": true, 00:14:51.524 "data_offset": 0, 00:14:51.524 "data_size": 65536 00:14:51.524 }, 00:14:51.524 { 00:14:51.524 "name": null, 00:14:51.524 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:51.524 "is_configured": false, 00:14:51.524 "data_offset": 0, 00:14:51.524 "data_size": 65536 00:14:51.524 }, 00:14:51.524 { 00:14:51.524 "name": "BaseBdev3", 00:14:51.524 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:51.524 "is_configured": true, 00:14:51.524 "data_offset": 0, 
00:14:51.524 "data_size": 65536 00:14:51.524 } 00:14:51.524 ] 00:14:51.524 }' 00:14:51.524 18:18:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.524 18:18:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.090 18:18:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.090 18:18:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:52.349 18:18:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:52.349 18:18:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:52.608 [2024-07-12 18:18:36.151011] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:52.608 
18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:52.608 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:52.867 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:52.867 "name": "Existed_Raid", 00:14:52.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:52.867 "strip_size_kb": 64, 00:14:52.867 "state": "configuring", 00:14:52.867 "raid_level": "concat", 00:14:52.867 "superblock": false, 00:14:52.867 "num_base_bdevs": 3, 00:14:52.867 "num_base_bdevs_discovered": 1, 00:14:52.867 "num_base_bdevs_operational": 3, 00:14:52.867 "base_bdevs_list": [ 00:14:52.867 { 00:14:52.867 "name": null, 00:14:52.867 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:52.867 "is_configured": false, 00:14:52.867 "data_offset": 0, 00:14:52.867 "data_size": 65536 00:14:52.867 }, 00:14:52.867 { 00:14:52.867 "name": null, 00:14:52.867 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:52.867 "is_configured": false, 00:14:52.867 "data_offset": 0, 00:14:52.867 "data_size": 65536 00:14:52.867 }, 00:14:52.867 { 00:14:52.867 "name": "BaseBdev3", 00:14:52.867 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:52.867 "is_configured": true, 00:14:52.867 "data_offset": 0, 00:14:52.867 "data_size": 65536 00:14:52.867 } 00:14:52.867 ] 00:14:52.867 }' 00:14:52.867 18:18:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:52.867 18:18:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.433 18:18:37 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.433 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:53.690 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:53.690 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:53.949 [2024-07-12 18:18:37.507887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.949 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.207 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.207 "name": "Existed_Raid", 00:14:54.207 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.207 "strip_size_kb": 64, 00:14:54.207 "state": "configuring", 00:14:54.207 "raid_level": "concat", 00:14:54.207 "superblock": false, 00:14:54.207 "num_base_bdevs": 3, 00:14:54.207 "num_base_bdevs_discovered": 2, 00:14:54.207 "num_base_bdevs_operational": 3, 00:14:54.207 "base_bdevs_list": [ 00:14:54.207 { 00:14:54.207 "name": null, 00:14:54.207 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:54.207 "is_configured": false, 00:14:54.207 "data_offset": 0, 00:14:54.207 "data_size": 65536 00:14:54.207 }, 00:14:54.207 { 00:14:54.207 "name": "BaseBdev2", 00:14:54.207 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:54.207 "is_configured": true, 00:14:54.207 "data_offset": 0, 00:14:54.207 "data_size": 65536 00:14:54.207 }, 00:14:54.207 { 00:14:54.207 "name": "BaseBdev3", 00:14:54.207 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:54.207 "is_configured": true, 00:14:54.207 "data_offset": 0, 00:14:54.207 "data_size": 65536 00:14:54.207 } 00:14:54.207 ] 00:14:54.207 }' 00:14:54.207 18:18:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.207 18:18:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.773 18:18:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.773 18:18:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:55.030 
18:18:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:55.030 18:18:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.030 18:18:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:55.595 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6fa97b09-7dd7-40da-95f8-44c2ef5e51a6 00:14:55.854 [2024-07-12 18:18:39.406099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:55.854 [2024-07-12 18:18:39.406138] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f58450 00:14:55.854 [2024-07-12 18:18:39.406147] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:55.854 [2024-07-12 18:18:39.406353] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f59ed0 00:14:55.854 [2024-07-12 18:18:39.406475] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f58450 00:14:55.854 [2024-07-12 18:18:39.406486] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f58450 00:14:55.854 [2024-07-12 18:18:39.406654] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:55.854 NewBaseBdev 00:14:55.854 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:55.854 18:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:55.854 18:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:55.854 18:18:39 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:14:55.854 18:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:55.854 18:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:55.854 18:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:56.112 18:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:56.371 [ 00:14:56.371 { 00:14:56.371 "name": "NewBaseBdev", 00:14:56.371 "aliases": [ 00:14:56.371 "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6" 00:14:56.371 ], 00:14:56.371 "product_name": "Malloc disk", 00:14:56.371 "block_size": 512, 00:14:56.371 "num_blocks": 65536, 00:14:56.371 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:56.371 "assigned_rate_limits": { 00:14:56.371 "rw_ios_per_sec": 0, 00:14:56.371 "rw_mbytes_per_sec": 0, 00:14:56.371 "r_mbytes_per_sec": 0, 00:14:56.371 "w_mbytes_per_sec": 0 00:14:56.371 }, 00:14:56.371 "claimed": true, 00:14:56.371 "claim_type": "exclusive_write", 00:14:56.371 "zoned": false, 00:14:56.371 "supported_io_types": { 00:14:56.371 "read": true, 00:14:56.371 "write": true, 00:14:56.371 "unmap": true, 00:14:56.371 "flush": true, 00:14:56.371 "reset": true, 00:14:56.371 "nvme_admin": false, 00:14:56.371 "nvme_io": false, 00:14:56.371 "nvme_io_md": false, 00:14:56.371 "write_zeroes": true, 00:14:56.371 "zcopy": true, 00:14:56.371 "get_zone_info": false, 00:14:56.371 "zone_management": false, 00:14:56.371 "zone_append": false, 00:14:56.371 "compare": false, 00:14:56.371 "compare_and_write": false, 00:14:56.371 "abort": true, 00:14:56.371 "seek_hole": false, 00:14:56.371 "seek_data": false, 00:14:56.371 "copy": true, 00:14:56.371 "nvme_iov_md": 
false 00:14:56.371 }, 00:14:56.371 "memory_domains": [ 00:14:56.371 { 00:14:56.371 "dma_device_id": "system", 00:14:56.371 "dma_device_type": 1 00:14:56.371 }, 00:14:56.371 { 00:14:56.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.371 "dma_device_type": 2 00:14:56.371 } 00:14:56.371 ], 00:14:56.371 "driver_specific": {} 00:14:56.371 } 00:14:56.371 ] 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.371 18:18:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.629 18:18:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.629 "name": "Existed_Raid", 00:14:56.629 "uuid": "30568aca-1bb1-4bfe-a283-80ad72378c4d", 00:14:56.629 "strip_size_kb": 64, 00:14:56.629 "state": "online", 00:14:56.629 "raid_level": "concat", 00:14:56.629 "superblock": false, 00:14:56.629 "num_base_bdevs": 3, 00:14:56.629 "num_base_bdevs_discovered": 3, 00:14:56.629 "num_base_bdevs_operational": 3, 00:14:56.629 "base_bdevs_list": [ 00:14:56.629 { 00:14:56.629 "name": "NewBaseBdev", 00:14:56.629 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:56.629 "is_configured": true, 00:14:56.629 "data_offset": 0, 00:14:56.629 "data_size": 65536 00:14:56.629 }, 00:14:56.629 { 00:14:56.629 "name": "BaseBdev2", 00:14:56.629 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:56.629 "is_configured": true, 00:14:56.629 "data_offset": 0, 00:14:56.629 "data_size": 65536 00:14:56.629 }, 00:14:56.629 { 00:14:56.629 "name": "BaseBdev3", 00:14:56.629 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:56.629 "is_configured": true, 00:14:56.629 "data_offset": 0, 00:14:56.629 "data_size": 65536 00:14:56.629 } 00:14:56.629 ] 00:14:56.629 }' 00:14:56.629 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.629 18:18:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.196 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:57.196 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:57.196 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:57.196 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:57.196 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:57.196 18:18:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:57.196 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:57.196 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:57.196 [2024-07-12 18:18:40.914411] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:57.455 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:57.455 "name": "Existed_Raid", 00:14:57.455 "aliases": [ 00:14:57.455 "30568aca-1bb1-4bfe-a283-80ad72378c4d" 00:14:57.455 ], 00:14:57.455 "product_name": "Raid Volume", 00:14:57.455 "block_size": 512, 00:14:57.455 "num_blocks": 196608, 00:14:57.455 "uuid": "30568aca-1bb1-4bfe-a283-80ad72378c4d", 00:14:57.455 "assigned_rate_limits": { 00:14:57.455 "rw_ios_per_sec": 0, 00:14:57.455 "rw_mbytes_per_sec": 0, 00:14:57.455 "r_mbytes_per_sec": 0, 00:14:57.455 "w_mbytes_per_sec": 0 00:14:57.455 }, 00:14:57.455 "claimed": false, 00:14:57.455 "zoned": false, 00:14:57.455 "supported_io_types": { 00:14:57.455 "read": true, 00:14:57.455 "write": true, 00:14:57.455 "unmap": true, 00:14:57.455 "flush": true, 00:14:57.455 "reset": true, 00:14:57.455 "nvme_admin": false, 00:14:57.455 "nvme_io": false, 00:14:57.455 "nvme_io_md": false, 00:14:57.455 "write_zeroes": true, 00:14:57.455 "zcopy": false, 00:14:57.455 "get_zone_info": false, 00:14:57.455 "zone_management": false, 00:14:57.455 "zone_append": false, 00:14:57.455 "compare": false, 00:14:57.455 "compare_and_write": false, 00:14:57.455 "abort": false, 00:14:57.455 "seek_hole": false, 00:14:57.455 "seek_data": false, 00:14:57.455 "copy": false, 00:14:57.455 "nvme_iov_md": false 00:14:57.455 }, 00:14:57.455 "memory_domains": [ 00:14:57.455 { 00:14:57.455 "dma_device_id": "system", 00:14:57.455 "dma_device_type": 1 00:14:57.455 }, 
00:14:57.455 { 00:14:57.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.455 "dma_device_type": 2 00:14:57.455 }, 00:14:57.455 { 00:14:57.455 "dma_device_id": "system", 00:14:57.455 "dma_device_type": 1 00:14:57.455 }, 00:14:57.455 { 00:14:57.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.455 "dma_device_type": 2 00:14:57.455 }, 00:14:57.455 { 00:14:57.455 "dma_device_id": "system", 00:14:57.455 "dma_device_type": 1 00:14:57.455 }, 00:14:57.455 { 00:14:57.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.455 "dma_device_type": 2 00:14:57.455 } 00:14:57.455 ], 00:14:57.455 "driver_specific": { 00:14:57.455 "raid": { 00:14:57.455 "uuid": "30568aca-1bb1-4bfe-a283-80ad72378c4d", 00:14:57.455 "strip_size_kb": 64, 00:14:57.455 "state": "online", 00:14:57.455 "raid_level": "concat", 00:14:57.455 "superblock": false, 00:14:57.455 "num_base_bdevs": 3, 00:14:57.455 "num_base_bdevs_discovered": 3, 00:14:57.455 "num_base_bdevs_operational": 3, 00:14:57.455 "base_bdevs_list": [ 00:14:57.455 { 00:14:57.455 "name": "NewBaseBdev", 00:14:57.455 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:57.455 "is_configured": true, 00:14:57.455 "data_offset": 0, 00:14:57.455 "data_size": 65536 00:14:57.455 }, 00:14:57.455 { 00:14:57.455 "name": "BaseBdev2", 00:14:57.455 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:57.455 "is_configured": true, 00:14:57.455 "data_offset": 0, 00:14:57.455 "data_size": 65536 00:14:57.455 }, 00:14:57.455 { 00:14:57.455 "name": "BaseBdev3", 00:14:57.455 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:57.455 "is_configured": true, 00:14:57.455 "data_offset": 0, 00:14:57.455 "data_size": 65536 00:14:57.455 } 00:14:57.455 ] 00:14:57.455 } 00:14:57.455 } 00:14:57.455 }' 00:14:57.455 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:57.455 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:14:57.455 BaseBdev2 00:14:57.455 BaseBdev3' 00:14:57.455 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:57.455 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:57.455 18:18:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:57.714 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:57.715 "name": "NewBaseBdev", 00:14:57.715 "aliases": [ 00:14:57.715 "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6" 00:14:57.715 ], 00:14:57.715 "product_name": "Malloc disk", 00:14:57.715 "block_size": 512, 00:14:57.715 "num_blocks": 65536, 00:14:57.715 "uuid": "6fa97b09-7dd7-40da-95f8-44c2ef5e51a6", 00:14:57.715 "assigned_rate_limits": { 00:14:57.715 "rw_ios_per_sec": 0, 00:14:57.715 "rw_mbytes_per_sec": 0, 00:14:57.715 "r_mbytes_per_sec": 0, 00:14:57.715 "w_mbytes_per_sec": 0 00:14:57.715 }, 00:14:57.715 "claimed": true, 00:14:57.715 "claim_type": "exclusive_write", 00:14:57.715 "zoned": false, 00:14:57.715 "supported_io_types": { 00:14:57.715 "read": true, 00:14:57.715 "write": true, 00:14:57.715 "unmap": true, 00:14:57.715 "flush": true, 00:14:57.715 "reset": true, 00:14:57.715 "nvme_admin": false, 00:14:57.715 "nvme_io": false, 00:14:57.715 "nvme_io_md": false, 00:14:57.715 "write_zeroes": true, 00:14:57.715 "zcopy": true, 00:14:57.715 "get_zone_info": false, 00:14:57.715 "zone_management": false, 00:14:57.715 "zone_append": false, 00:14:57.715 "compare": false, 00:14:57.715 "compare_and_write": false, 00:14:57.715 "abort": true, 00:14:57.715 "seek_hole": false, 00:14:57.715 "seek_data": false, 00:14:57.715 "copy": true, 00:14:57.715 "nvme_iov_md": false 00:14:57.715 }, 00:14:57.715 "memory_domains": [ 00:14:57.715 { 00:14:57.715 "dma_device_id": "system", 00:14:57.715 
"dma_device_type": 1 00:14:57.715 }, 00:14:57.715 { 00:14:57.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.715 "dma_device_type": 2 00:14:57.715 } 00:14:57.715 ], 00:14:57.715 "driver_specific": {} 00:14:57.715 }' 00:14:57.715 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.715 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.715 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:57.715 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.715 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.715 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:57.715 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.715 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.974 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:57.974 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.974 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.974 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:57.974 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:57.974 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:57.974 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.232 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.232 "name": 
"BaseBdev2", 00:14:58.232 "aliases": [ 00:14:58.232 "e9e9b4b1-316d-4484-b071-4fd20f2573a5" 00:14:58.232 ], 00:14:58.232 "product_name": "Malloc disk", 00:14:58.232 "block_size": 512, 00:14:58.232 "num_blocks": 65536, 00:14:58.232 "uuid": "e9e9b4b1-316d-4484-b071-4fd20f2573a5", 00:14:58.232 "assigned_rate_limits": { 00:14:58.232 "rw_ios_per_sec": 0, 00:14:58.232 "rw_mbytes_per_sec": 0, 00:14:58.232 "r_mbytes_per_sec": 0, 00:14:58.232 "w_mbytes_per_sec": 0 00:14:58.232 }, 00:14:58.232 "claimed": true, 00:14:58.232 "claim_type": "exclusive_write", 00:14:58.232 "zoned": false, 00:14:58.232 "supported_io_types": { 00:14:58.232 "read": true, 00:14:58.232 "write": true, 00:14:58.232 "unmap": true, 00:14:58.232 "flush": true, 00:14:58.232 "reset": true, 00:14:58.232 "nvme_admin": false, 00:14:58.232 "nvme_io": false, 00:14:58.232 "nvme_io_md": false, 00:14:58.232 "write_zeroes": true, 00:14:58.232 "zcopy": true, 00:14:58.232 "get_zone_info": false, 00:14:58.232 "zone_management": false, 00:14:58.232 "zone_append": false, 00:14:58.232 "compare": false, 00:14:58.232 "compare_and_write": false, 00:14:58.232 "abort": true, 00:14:58.232 "seek_hole": false, 00:14:58.232 "seek_data": false, 00:14:58.232 "copy": true, 00:14:58.232 "nvme_iov_md": false 00:14:58.232 }, 00:14:58.232 "memory_domains": [ 00:14:58.232 { 00:14:58.232 "dma_device_id": "system", 00:14:58.232 "dma_device_type": 1 00:14:58.232 }, 00:14:58.232 { 00:14:58.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.232 "dma_device_type": 2 00:14:58.232 } 00:14:58.232 ], 00:14:58.232 "driver_specific": {} 00:14:58.232 }' 00:14:58.232 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.232 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.232 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.232 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:14:58.232 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.491 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.491 18:18:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.491 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.491 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.491 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.491 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.491 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.491 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.491 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:58.491 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.749 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.749 "name": "BaseBdev3", 00:14:58.749 "aliases": [ 00:14:58.749 "f309239d-ca90-4508-9adf-09e47699e432" 00:14:58.749 ], 00:14:58.749 "product_name": "Malloc disk", 00:14:58.749 "block_size": 512, 00:14:58.749 "num_blocks": 65536, 00:14:58.749 "uuid": "f309239d-ca90-4508-9adf-09e47699e432", 00:14:58.749 "assigned_rate_limits": { 00:14:58.749 "rw_ios_per_sec": 0, 00:14:58.749 "rw_mbytes_per_sec": 0, 00:14:58.749 "r_mbytes_per_sec": 0, 00:14:58.749 "w_mbytes_per_sec": 0 00:14:58.749 }, 00:14:58.749 "claimed": true, 00:14:58.749 "claim_type": "exclusive_write", 00:14:58.749 "zoned": false, 00:14:58.749 "supported_io_types": { 
00:14:58.749 "read": true, 00:14:58.749 "write": true, 00:14:58.749 "unmap": true, 00:14:58.749 "flush": true, 00:14:58.749 "reset": true, 00:14:58.749 "nvme_admin": false, 00:14:58.749 "nvme_io": false, 00:14:58.749 "nvme_io_md": false, 00:14:58.749 "write_zeroes": true, 00:14:58.749 "zcopy": true, 00:14:58.749 "get_zone_info": false, 00:14:58.749 "zone_management": false, 00:14:58.749 "zone_append": false, 00:14:58.749 "compare": false, 00:14:58.749 "compare_and_write": false, 00:14:58.749 "abort": true, 00:14:58.749 "seek_hole": false, 00:14:58.749 "seek_data": false, 00:14:58.749 "copy": true, 00:14:58.749 "nvme_iov_md": false 00:14:58.749 }, 00:14:58.749 "memory_domains": [ 00:14:58.749 { 00:14:58.749 "dma_device_id": "system", 00:14:58.749 "dma_device_type": 1 00:14:58.749 }, 00:14:58.749 { 00:14:58.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.749 "dma_device_type": 2 00:14:58.749 } 00:14:58.749 ], 00:14:58.749 "driver_specific": {} 00:14:58.749 }' 00:14:58.749 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.749 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.008 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:59.008 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.008 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.008 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:59.008 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.008 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.008 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.008 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:59.008 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.266 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.266 18:18:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:59.266 [2024-07-12 18:18:42.991648] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:59.266 [2024-07-12 18:18:42.991671] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:59.266 [2024-07-12 18:18:42.991724] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:59.266 [2024-07-12 18:18:42.991775] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:59.266 [2024-07-12 18:18:42.991787] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f58450 name Existed_Raid, state offline 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2491187 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2491187 ']' 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2491187 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2491187 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2491187' 00:14:59.524 killing process with pid 2491187 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2491187 00:14:59.524 [2024-07-12 18:18:43.067219] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:59.524 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2491187 00:14:59.524 [2024-07-12 18:18:43.121719] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:59.782 18:18:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:59.782 00:14:59.782 real 0m29.307s 00:14:59.782 user 0m53.624s 00:14:59.782 sys 0m5.144s 00:14:59.782 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:59.782 18:18:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.782 ************************************ 00:14:59.782 END TEST raid_state_function_test 00:14:59.782 ************************************ 00:15:00.041 18:18:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:00.041 18:18:43 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:00.041 18:18:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:00.041 18:18:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:00.041 18:18:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:00.041 ************************************ 00:15:00.041 START TEST raid_state_function_test_sb 00:15:00.041 ************************************ 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2495649 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2495649' 00:15:00.041 Process raid pid: 2495649 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2495649 /var/tmp/spdk-raid.sock 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2495649 ']' 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:15:00.041 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:00.041 18:18:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:00.041 [2024-07-12 18:18:43.648336] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:15:00.041 [2024-07-12 18:18:43.648388] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:00.041 [2024-07-12 18:18:43.762708] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:00.337 [2024-07-12 18:18:43.868398] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:00.337 [2024-07-12 18:18:43.927252] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:00.337 [2024-07-12 18:18:43.927282] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:00.915 18:18:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:00.915 18:18:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:00.915 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:01.174 [2024-07-12 18:18:44.817225] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:01.174 [2024-07-12 18:18:44.817270] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:01.174 [2024-07-12 18:18:44.817281] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:01.174 [2024-07-12 18:18:44.817293] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:01.174 [2024-07-12 18:18:44.817302] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:01.174 [2024-07-12 18:18:44.817314] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.174 18:18:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:01.433 18:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.433 "name": "Existed_Raid", 00:15:01.433 "uuid": "a669ea3c-bc6f-4ba7-b054-b6caf2a6e854", 00:15:01.433 "strip_size_kb": 64, 00:15:01.433 "state": "configuring", 00:15:01.433 "raid_level": "concat", 00:15:01.433 "superblock": true, 00:15:01.433 "num_base_bdevs": 3, 00:15:01.433 "num_base_bdevs_discovered": 0, 00:15:01.433 "num_base_bdevs_operational": 3, 00:15:01.433 "base_bdevs_list": [ 00:15:01.433 { 00:15:01.433 "name": "BaseBdev1", 00:15:01.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.433 "is_configured": false, 00:15:01.433 "data_offset": 0, 00:15:01.433 "data_size": 0 00:15:01.433 }, 00:15:01.433 { 00:15:01.433 "name": "BaseBdev2", 00:15:01.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.433 "is_configured": false, 00:15:01.433 "data_offset": 0, 00:15:01.433 "data_size": 0 00:15:01.433 }, 00:15:01.433 { 00:15:01.433 "name": "BaseBdev3", 00:15:01.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.433 "is_configured": false, 00:15:01.433 "data_offset": 0, 00:15:01.433 "data_size": 0 00:15:01.433 } 00:15:01.433 ] 00:15:01.433 }' 00:15:01.433 18:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.433 18:18:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:01.999 18:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:02.258 [2024-07-12 18:18:45.815738] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:02.258 [2024-07-12 18:18:45.815772] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf81a80 name Existed_Raid, state configuring 00:15:02.258 18:18:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:02.517 [2024-07-12 18:18:46.068437] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:02.517 [2024-07-12 18:18:46.068472] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:02.517 [2024-07-12 18:18:46.068482] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:02.517 [2024-07-12 18:18:46.068494] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:02.517 [2024-07-12 18:18:46.068503] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:02.517 [2024-07-12 18:18:46.068514] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:02.517 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:02.776 [2024-07-12 18:18:46.320139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:02.776 BaseBdev1 00:15:02.776 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:02.776 18:18:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:02.776 18:18:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:02.776 18:18:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:02.776 18:18:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:02.776 18:18:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:02.776 18:18:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:03.035 18:18:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:03.294 [ 00:15:03.294 { 00:15:03.294 "name": "BaseBdev1", 00:15:03.294 "aliases": [ 00:15:03.294 "a6d65cc9-7c40-4062-93c5-ab23b08c9588" 00:15:03.294 ], 00:15:03.294 "product_name": "Malloc disk", 00:15:03.294 "block_size": 512, 00:15:03.294 "num_blocks": 65536, 00:15:03.294 "uuid": "a6d65cc9-7c40-4062-93c5-ab23b08c9588", 00:15:03.294 "assigned_rate_limits": { 00:15:03.294 "rw_ios_per_sec": 0, 00:15:03.294 "rw_mbytes_per_sec": 0, 00:15:03.294 "r_mbytes_per_sec": 0, 00:15:03.294 "w_mbytes_per_sec": 0 00:15:03.294 }, 00:15:03.294 "claimed": true, 00:15:03.294 "claim_type": "exclusive_write", 00:15:03.294 "zoned": false, 00:15:03.294 "supported_io_types": { 00:15:03.294 "read": true, 00:15:03.294 "write": true, 00:15:03.294 "unmap": true, 00:15:03.294 "flush": true, 00:15:03.294 "reset": true, 00:15:03.294 "nvme_admin": false, 00:15:03.294 "nvme_io": false, 00:15:03.294 "nvme_io_md": false, 00:15:03.294 "write_zeroes": true, 00:15:03.294 "zcopy": true, 00:15:03.294 "get_zone_info": false, 00:15:03.294 "zone_management": false, 00:15:03.294 "zone_append": false, 00:15:03.294 "compare": false, 00:15:03.294 "compare_and_write": false, 00:15:03.294 "abort": true, 00:15:03.294 "seek_hole": false, 00:15:03.294 "seek_data": false, 00:15:03.294 "copy": true, 00:15:03.294 "nvme_iov_md": false 00:15:03.294 }, 00:15:03.294 "memory_domains": [ 00:15:03.294 { 00:15:03.294 "dma_device_id": "system", 00:15:03.294 "dma_device_type": 1 00:15:03.294 }, 00:15:03.294 { 00:15:03.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.294 
"dma_device_type": 2 00:15:03.294 } 00:15:03.294 ], 00:15:03.294 "driver_specific": {} 00:15:03.294 } 00:15:03.294 ] 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.294 18:18:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:03.553 18:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.553 "name": "Existed_Raid", 00:15:03.553 "uuid": "0aba97df-66e1-4518-8f63-0b4fac88a8d9", 00:15:03.553 "strip_size_kb": 64, 
00:15:03.553 "state": "configuring", 00:15:03.553 "raid_level": "concat", 00:15:03.553 "superblock": true, 00:15:03.553 "num_base_bdevs": 3, 00:15:03.553 "num_base_bdevs_discovered": 1, 00:15:03.553 "num_base_bdevs_operational": 3, 00:15:03.553 "base_bdevs_list": [ 00:15:03.553 { 00:15:03.553 "name": "BaseBdev1", 00:15:03.553 "uuid": "a6d65cc9-7c40-4062-93c5-ab23b08c9588", 00:15:03.553 "is_configured": true, 00:15:03.553 "data_offset": 2048, 00:15:03.553 "data_size": 63488 00:15:03.553 }, 00:15:03.553 { 00:15:03.553 "name": "BaseBdev2", 00:15:03.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.553 "is_configured": false, 00:15:03.553 "data_offset": 0, 00:15:03.553 "data_size": 0 00:15:03.553 }, 00:15:03.553 { 00:15:03.553 "name": "BaseBdev3", 00:15:03.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.553 "is_configured": false, 00:15:03.553 "data_offset": 0, 00:15:03.553 "data_size": 0 00:15:03.553 } 00:15:03.553 ] 00:15:03.553 }' 00:15:03.553 18:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.553 18:18:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:04.120 18:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:04.379 [2024-07-12 18:18:47.868258] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:04.379 [2024-07-12 18:18:47.868302] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf81310 name Existed_Raid, state configuring 00:15:04.379 18:18:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:04.638 [2024-07-12 18:18:48.112946] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:04.638 [2024-07-12 18:18:48.114420] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:04.638 [2024-07-12 18:18:48.114451] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:04.638 [2024-07-12 18:18:48.114461] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:04.638 [2024-07-12 18:18:48.114473] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.638 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:04.897 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.897 "name": "Existed_Raid", 00:15:04.897 "uuid": "5a5de65e-3efa-4cc4-b39e-87f5c69daf74", 00:15:04.897 "strip_size_kb": 64, 00:15:04.897 "state": "configuring", 00:15:04.897 "raid_level": "concat", 00:15:04.897 "superblock": true, 00:15:04.897 "num_base_bdevs": 3, 00:15:04.897 "num_base_bdevs_discovered": 1, 00:15:04.897 "num_base_bdevs_operational": 3, 00:15:04.897 "base_bdevs_list": [ 00:15:04.897 { 00:15:04.897 "name": "BaseBdev1", 00:15:04.897 "uuid": "a6d65cc9-7c40-4062-93c5-ab23b08c9588", 00:15:04.897 "is_configured": true, 00:15:04.897 "data_offset": 2048, 00:15:04.897 "data_size": 63488 00:15:04.897 }, 00:15:04.897 { 00:15:04.897 "name": "BaseBdev2", 00:15:04.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.897 "is_configured": false, 00:15:04.897 "data_offset": 0, 00:15:04.897 "data_size": 0 00:15:04.897 }, 00:15:04.897 { 00:15:04.897 "name": "BaseBdev3", 00:15:04.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.897 "is_configured": false, 00:15:04.897 "data_offset": 0, 00:15:04.897 "data_size": 0 00:15:04.897 } 00:15:04.897 ] 00:15:04.897 }' 00:15:04.897 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.897 18:18:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:05.464 18:18:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:15:05.464 [2024-07-12 18:18:49.179304] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:05.464 BaseBdev2 00:15:05.722 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:05.722 18:18:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:05.722 18:18:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:05.722 18:18:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:05.722 18:18:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:05.722 18:18:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:05.722 18:18:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:05.722 18:18:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:05.980 [ 00:15:05.980 { 00:15:05.980 "name": "BaseBdev2", 00:15:05.980 "aliases": [ 00:15:05.980 "91a6f07a-01a8-4bc9-b1a2-a135c9346c73" 00:15:05.980 ], 00:15:05.980 "product_name": "Malloc disk", 00:15:05.980 "block_size": 512, 00:15:05.980 "num_blocks": 65536, 00:15:05.980 "uuid": "91a6f07a-01a8-4bc9-b1a2-a135c9346c73", 00:15:05.980 "assigned_rate_limits": { 00:15:05.980 "rw_ios_per_sec": 0, 00:15:05.980 "rw_mbytes_per_sec": 0, 00:15:05.980 "r_mbytes_per_sec": 0, 00:15:05.980 "w_mbytes_per_sec": 0 00:15:05.980 }, 00:15:05.980 "claimed": true, 00:15:05.980 "claim_type": "exclusive_write", 00:15:05.980 "zoned": false, 00:15:05.980 "supported_io_types": { 00:15:05.980 "read": true, 00:15:05.980 "write": true, 
00:15:05.980 "unmap": true, 00:15:05.980 "flush": true, 00:15:05.980 "reset": true, 00:15:05.980 "nvme_admin": false, 00:15:05.980 "nvme_io": false, 00:15:05.980 "nvme_io_md": false, 00:15:05.980 "write_zeroes": true, 00:15:05.980 "zcopy": true, 00:15:05.980 "get_zone_info": false, 00:15:05.980 "zone_management": false, 00:15:05.980 "zone_append": false, 00:15:05.980 "compare": false, 00:15:05.980 "compare_and_write": false, 00:15:05.980 "abort": true, 00:15:05.980 "seek_hole": false, 00:15:05.980 "seek_data": false, 00:15:05.980 "copy": true, 00:15:05.980 "nvme_iov_md": false 00:15:05.980 }, 00:15:05.980 "memory_domains": [ 00:15:05.980 { 00:15:05.980 "dma_device_id": "system", 00:15:05.980 "dma_device_type": 1 00:15:05.980 }, 00:15:05.980 { 00:15:05.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.980 "dma_device_type": 2 00:15:05.980 } 00:15:05.980 ], 00:15:05.980 "driver_specific": {} 00:15:05.980 } 00:15:05.980 ] 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.980 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.239 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.239 "name": "Existed_Raid", 00:15:06.239 "uuid": "5a5de65e-3efa-4cc4-b39e-87f5c69daf74", 00:15:06.239 "strip_size_kb": 64, 00:15:06.239 "state": "configuring", 00:15:06.239 "raid_level": "concat", 00:15:06.239 "superblock": true, 00:15:06.239 "num_base_bdevs": 3, 00:15:06.239 "num_base_bdevs_discovered": 2, 00:15:06.239 "num_base_bdevs_operational": 3, 00:15:06.239 "base_bdevs_list": [ 00:15:06.239 { 00:15:06.239 "name": "BaseBdev1", 00:15:06.239 "uuid": "a6d65cc9-7c40-4062-93c5-ab23b08c9588", 00:15:06.239 "is_configured": true, 00:15:06.239 "data_offset": 2048, 00:15:06.239 "data_size": 63488 00:15:06.239 }, 00:15:06.239 { 00:15:06.239 "name": "BaseBdev2", 00:15:06.239 "uuid": "91a6f07a-01a8-4bc9-b1a2-a135c9346c73", 00:15:06.239 "is_configured": true, 00:15:06.239 "data_offset": 2048, 00:15:06.239 "data_size": 63488 00:15:06.239 }, 00:15:06.239 { 00:15:06.239 "name": "BaseBdev3", 00:15:06.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.239 "is_configured": false, 00:15:06.239 "data_offset": 0, 00:15:06.239 "data_size": 0 00:15:06.239 } 
00:15:06.239 ] 00:15:06.239 }' 00:15:06.239 18:18:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.239 18:18:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.175 18:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:07.175 [2024-07-12 18:18:50.774970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:07.175 [2024-07-12 18:18:50.775131] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf82400 00:15:07.175 [2024-07-12 18:18:50.775145] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:07.175 [2024-07-12 18:18:50.775320] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf81ef0 00:15:07.175 [2024-07-12 18:18:50.775437] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf82400 00:15:07.175 [2024-07-12 18:18:50.775447] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf82400 00:15:07.175 [2024-07-12 18:18:50.775536] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:07.175 BaseBdev3 00:15:07.175 18:18:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:07.175 18:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:07.175 18:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:07.176 18:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:07.176 18:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:07.176 18:18:50 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:07.176 18:18:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:07.433 18:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:07.691 [ 00:15:07.691 { 00:15:07.691 "name": "BaseBdev3", 00:15:07.691 "aliases": [ 00:15:07.691 "b1d89af2-e39f-4308-9913-c3b9091f29d8" 00:15:07.691 ], 00:15:07.691 "product_name": "Malloc disk", 00:15:07.691 "block_size": 512, 00:15:07.691 "num_blocks": 65536, 00:15:07.691 "uuid": "b1d89af2-e39f-4308-9913-c3b9091f29d8", 00:15:07.691 "assigned_rate_limits": { 00:15:07.691 "rw_ios_per_sec": 0, 00:15:07.691 "rw_mbytes_per_sec": 0, 00:15:07.691 "r_mbytes_per_sec": 0, 00:15:07.691 "w_mbytes_per_sec": 0 00:15:07.691 }, 00:15:07.691 "claimed": true, 00:15:07.691 "claim_type": "exclusive_write", 00:15:07.691 "zoned": false, 00:15:07.691 "supported_io_types": { 00:15:07.691 "read": true, 00:15:07.691 "write": true, 00:15:07.691 "unmap": true, 00:15:07.691 "flush": true, 00:15:07.691 "reset": true, 00:15:07.691 "nvme_admin": false, 00:15:07.691 "nvme_io": false, 00:15:07.691 "nvme_io_md": false, 00:15:07.691 "write_zeroes": true, 00:15:07.691 "zcopy": true, 00:15:07.691 "get_zone_info": false, 00:15:07.691 "zone_management": false, 00:15:07.691 "zone_append": false, 00:15:07.691 "compare": false, 00:15:07.691 "compare_and_write": false, 00:15:07.691 "abort": true, 00:15:07.691 "seek_hole": false, 00:15:07.691 "seek_data": false, 00:15:07.691 "copy": true, 00:15:07.691 "nvme_iov_md": false 00:15:07.691 }, 00:15:07.691 "memory_domains": [ 00:15:07.691 { 00:15:07.691 "dma_device_id": "system", 00:15:07.691 "dma_device_type": 1 00:15:07.691 }, 00:15:07.691 { 00:15:07.691 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:07.691 "dma_device_type": 2 00:15:07.691 } 00:15:07.691 ], 00:15:07.691 "driver_specific": {} 00:15:07.691 } 00:15:07.691 ] 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.691 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:07.950 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.950 "name": "Existed_Raid", 00:15:07.950 "uuid": "5a5de65e-3efa-4cc4-b39e-87f5c69daf74", 00:15:07.950 "strip_size_kb": 64, 00:15:07.950 "state": "online", 00:15:07.950 "raid_level": "concat", 00:15:07.950 "superblock": true, 00:15:07.950 "num_base_bdevs": 3, 00:15:07.950 "num_base_bdevs_discovered": 3, 00:15:07.950 "num_base_bdevs_operational": 3, 00:15:07.950 "base_bdevs_list": [ 00:15:07.950 { 00:15:07.950 "name": "BaseBdev1", 00:15:07.950 "uuid": "a6d65cc9-7c40-4062-93c5-ab23b08c9588", 00:15:07.950 "is_configured": true, 00:15:07.950 "data_offset": 2048, 00:15:07.950 "data_size": 63488 00:15:07.950 }, 00:15:07.950 { 00:15:07.950 "name": "BaseBdev2", 00:15:07.950 "uuid": "91a6f07a-01a8-4bc9-b1a2-a135c9346c73", 00:15:07.950 "is_configured": true, 00:15:07.950 "data_offset": 2048, 00:15:07.950 "data_size": 63488 00:15:07.950 }, 00:15:07.950 { 00:15:07.950 "name": "BaseBdev3", 00:15:07.950 "uuid": "b1d89af2-e39f-4308-9913-c3b9091f29d8", 00:15:07.950 "is_configured": true, 00:15:07.950 "data_offset": 2048, 00:15:07.950 "data_size": 63488 00:15:07.950 } 00:15:07.950 ] 00:15:07.950 }' 00:15:07.950 18:18:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.950 18:18:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:08.519 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:08.519 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:08.519 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:08.519 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:08.519 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:15:08.519 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:08.519 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:08.519 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:08.776 [2024-07-12 18:18:52.351453] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:08.776 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:08.776 "name": "Existed_Raid", 00:15:08.776 "aliases": [ 00:15:08.776 "5a5de65e-3efa-4cc4-b39e-87f5c69daf74" 00:15:08.776 ], 00:15:08.776 "product_name": "Raid Volume", 00:15:08.776 "block_size": 512, 00:15:08.776 "num_blocks": 190464, 00:15:08.776 "uuid": "5a5de65e-3efa-4cc4-b39e-87f5c69daf74", 00:15:08.776 "assigned_rate_limits": { 00:15:08.776 "rw_ios_per_sec": 0, 00:15:08.776 "rw_mbytes_per_sec": 0, 00:15:08.776 "r_mbytes_per_sec": 0, 00:15:08.776 "w_mbytes_per_sec": 0 00:15:08.776 }, 00:15:08.776 "claimed": false, 00:15:08.776 "zoned": false, 00:15:08.776 "supported_io_types": { 00:15:08.776 "read": true, 00:15:08.776 "write": true, 00:15:08.776 "unmap": true, 00:15:08.776 "flush": true, 00:15:08.776 "reset": true, 00:15:08.776 "nvme_admin": false, 00:15:08.776 "nvme_io": false, 00:15:08.776 "nvme_io_md": false, 00:15:08.776 "write_zeroes": true, 00:15:08.776 "zcopy": false, 00:15:08.776 "get_zone_info": false, 00:15:08.776 "zone_management": false, 00:15:08.776 "zone_append": false, 00:15:08.776 "compare": false, 00:15:08.776 "compare_and_write": false, 00:15:08.776 "abort": false, 00:15:08.776 "seek_hole": false, 00:15:08.776 "seek_data": false, 00:15:08.776 "copy": false, 00:15:08.776 "nvme_iov_md": false 00:15:08.776 }, 00:15:08.776 "memory_domains": [ 00:15:08.776 { 00:15:08.776 "dma_device_id": "system", 
00:15:08.776 "dma_device_type": 1 00:15:08.776 }, 00:15:08.776 { 00:15:08.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.776 "dma_device_type": 2 00:15:08.776 }, 00:15:08.776 { 00:15:08.776 "dma_device_id": "system", 00:15:08.776 "dma_device_type": 1 00:15:08.776 }, 00:15:08.776 { 00:15:08.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.776 "dma_device_type": 2 00:15:08.776 }, 00:15:08.776 { 00:15:08.776 "dma_device_id": "system", 00:15:08.776 "dma_device_type": 1 00:15:08.776 }, 00:15:08.776 { 00:15:08.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.777 "dma_device_type": 2 00:15:08.777 } 00:15:08.777 ], 00:15:08.777 "driver_specific": { 00:15:08.777 "raid": { 00:15:08.777 "uuid": "5a5de65e-3efa-4cc4-b39e-87f5c69daf74", 00:15:08.777 "strip_size_kb": 64, 00:15:08.777 "state": "online", 00:15:08.777 "raid_level": "concat", 00:15:08.777 "superblock": true, 00:15:08.777 "num_base_bdevs": 3, 00:15:08.777 "num_base_bdevs_discovered": 3, 00:15:08.777 "num_base_bdevs_operational": 3, 00:15:08.777 "base_bdevs_list": [ 00:15:08.777 { 00:15:08.777 "name": "BaseBdev1", 00:15:08.777 "uuid": "a6d65cc9-7c40-4062-93c5-ab23b08c9588", 00:15:08.777 "is_configured": true, 00:15:08.777 "data_offset": 2048, 00:15:08.777 "data_size": 63488 00:15:08.777 }, 00:15:08.777 { 00:15:08.777 "name": "BaseBdev2", 00:15:08.777 "uuid": "91a6f07a-01a8-4bc9-b1a2-a135c9346c73", 00:15:08.777 "is_configured": true, 00:15:08.777 "data_offset": 2048, 00:15:08.777 "data_size": 63488 00:15:08.777 }, 00:15:08.777 { 00:15:08.777 "name": "BaseBdev3", 00:15:08.777 "uuid": "b1d89af2-e39f-4308-9913-c3b9091f29d8", 00:15:08.777 "is_configured": true, 00:15:08.777 "data_offset": 2048, 00:15:08.777 "data_size": 63488 00:15:08.777 } 00:15:08.777 ] 00:15:08.777 } 00:15:08.777 } 00:15:08.777 }' 00:15:08.777 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:08.777 18:18:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:08.777 BaseBdev2 00:15:08.777 BaseBdev3' 00:15:08.777 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:08.777 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:08.777 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:09.036 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.036 "name": "BaseBdev1", 00:15:09.036 "aliases": [ 00:15:09.036 "a6d65cc9-7c40-4062-93c5-ab23b08c9588" 00:15:09.036 ], 00:15:09.036 "product_name": "Malloc disk", 00:15:09.036 "block_size": 512, 00:15:09.036 "num_blocks": 65536, 00:15:09.036 "uuid": "a6d65cc9-7c40-4062-93c5-ab23b08c9588", 00:15:09.036 "assigned_rate_limits": { 00:15:09.036 "rw_ios_per_sec": 0, 00:15:09.036 "rw_mbytes_per_sec": 0, 00:15:09.036 "r_mbytes_per_sec": 0, 00:15:09.036 "w_mbytes_per_sec": 0 00:15:09.036 }, 00:15:09.036 "claimed": true, 00:15:09.036 "claim_type": "exclusive_write", 00:15:09.036 "zoned": false, 00:15:09.036 "supported_io_types": { 00:15:09.036 "read": true, 00:15:09.036 "write": true, 00:15:09.036 "unmap": true, 00:15:09.036 "flush": true, 00:15:09.036 "reset": true, 00:15:09.036 "nvme_admin": false, 00:15:09.036 "nvme_io": false, 00:15:09.036 "nvme_io_md": false, 00:15:09.036 "write_zeroes": true, 00:15:09.036 "zcopy": true, 00:15:09.036 "get_zone_info": false, 00:15:09.036 "zone_management": false, 00:15:09.036 "zone_append": false, 00:15:09.036 "compare": false, 00:15:09.036 "compare_and_write": false, 00:15:09.036 "abort": true, 00:15:09.036 "seek_hole": false, 00:15:09.036 "seek_data": false, 00:15:09.036 "copy": true, 00:15:09.036 "nvme_iov_md": false 00:15:09.036 }, 00:15:09.036 "memory_domains": 
[ 00:15:09.036 { 00:15:09.036 "dma_device_id": "system", 00:15:09.036 "dma_device_type": 1 00:15:09.036 }, 00:15:09.036 { 00:15:09.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.036 "dma_device_type": 2 00:15:09.036 } 00:15:09.036 ], 00:15:09.036 "driver_specific": {} 00:15:09.036 }' 00:15:09.036 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.036 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.036 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:09.036 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.295 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.295 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:09.295 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.295 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.295 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:09.295 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.295 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:09.295 18:18:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:09.295 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:09.295 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:09.295 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:15:09.553 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:09.553 "name": "BaseBdev2", 00:15:09.553 "aliases": [ 00:15:09.553 "91a6f07a-01a8-4bc9-b1a2-a135c9346c73" 00:15:09.553 ], 00:15:09.553 "product_name": "Malloc disk", 00:15:09.553 "block_size": 512, 00:15:09.553 "num_blocks": 65536, 00:15:09.553 "uuid": "91a6f07a-01a8-4bc9-b1a2-a135c9346c73", 00:15:09.553 "assigned_rate_limits": { 00:15:09.553 "rw_ios_per_sec": 0, 00:15:09.553 "rw_mbytes_per_sec": 0, 00:15:09.553 "r_mbytes_per_sec": 0, 00:15:09.553 "w_mbytes_per_sec": 0 00:15:09.553 }, 00:15:09.553 "claimed": true, 00:15:09.553 "claim_type": "exclusive_write", 00:15:09.553 "zoned": false, 00:15:09.553 "supported_io_types": { 00:15:09.553 "read": true, 00:15:09.553 "write": true, 00:15:09.553 "unmap": true, 00:15:09.553 "flush": true, 00:15:09.553 "reset": true, 00:15:09.553 "nvme_admin": false, 00:15:09.553 "nvme_io": false, 00:15:09.553 "nvme_io_md": false, 00:15:09.553 "write_zeroes": true, 00:15:09.553 "zcopy": true, 00:15:09.553 "get_zone_info": false, 00:15:09.553 "zone_management": false, 00:15:09.553 "zone_append": false, 00:15:09.553 "compare": false, 00:15:09.553 "compare_and_write": false, 00:15:09.553 "abort": true, 00:15:09.553 "seek_hole": false, 00:15:09.553 "seek_data": false, 00:15:09.553 "copy": true, 00:15:09.553 "nvme_iov_md": false 00:15:09.553 }, 00:15:09.553 "memory_domains": [ 00:15:09.553 { 00:15:09.553 "dma_device_id": "system", 00:15:09.553 "dma_device_type": 1 00:15:09.553 }, 00:15:09.553 { 00:15:09.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.553 "dma_device_type": 2 00:15:09.553 } 00:15:09.553 ], 00:15:09.553 "driver_specific": {} 00:15:09.553 }' 00:15:09.553 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.812 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:09.812 18:18:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:09.812 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.812 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:09.812 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:09.812 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.812 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:09.812 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:09.812 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.072 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.072 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.072 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:10.072 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:10.072 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:10.331 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:10.331 "name": "BaseBdev3", 00:15:10.331 "aliases": [ 00:15:10.331 "b1d89af2-e39f-4308-9913-c3b9091f29d8" 00:15:10.331 ], 00:15:10.331 "product_name": "Malloc disk", 00:15:10.331 "block_size": 512, 00:15:10.331 "num_blocks": 65536, 00:15:10.331 "uuid": "b1d89af2-e39f-4308-9913-c3b9091f29d8", 00:15:10.331 "assigned_rate_limits": { 00:15:10.331 "rw_ios_per_sec": 0, 00:15:10.331 "rw_mbytes_per_sec": 0, 00:15:10.331 "r_mbytes_per_sec": 0, 00:15:10.331 
"w_mbytes_per_sec": 0 00:15:10.331 }, 00:15:10.331 "claimed": true, 00:15:10.331 "claim_type": "exclusive_write", 00:15:10.331 "zoned": false, 00:15:10.331 "supported_io_types": { 00:15:10.331 "read": true, 00:15:10.331 "write": true, 00:15:10.331 "unmap": true, 00:15:10.331 "flush": true, 00:15:10.331 "reset": true, 00:15:10.331 "nvme_admin": false, 00:15:10.331 "nvme_io": false, 00:15:10.331 "nvme_io_md": false, 00:15:10.331 "write_zeroes": true, 00:15:10.331 "zcopy": true, 00:15:10.331 "get_zone_info": false, 00:15:10.331 "zone_management": false, 00:15:10.331 "zone_append": false, 00:15:10.331 "compare": false, 00:15:10.331 "compare_and_write": false, 00:15:10.331 "abort": true, 00:15:10.331 "seek_hole": false, 00:15:10.331 "seek_data": false, 00:15:10.331 "copy": true, 00:15:10.331 "nvme_iov_md": false 00:15:10.331 }, 00:15:10.331 "memory_domains": [ 00:15:10.331 { 00:15:10.331 "dma_device_id": "system", 00:15:10.331 "dma_device_type": 1 00:15:10.331 }, 00:15:10.331 { 00:15:10.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.331 "dma_device_type": 2 00:15:10.331 } 00:15:10.331 ], 00:15:10.331 "driver_specific": {} 00:15:10.331 }' 00:15:10.331 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.331 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:10.331 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:10.331 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.331 18:18:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:10.331 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:10.331 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:10.589 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:10.589 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.589 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.590 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:10.590 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.590 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:10.848 [2024-07-12 18:18:54.420702] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:10.848 [2024-07-12 18:18:54.420733] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:10.848 [2024-07-12 18:18:54.420774] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.848 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.107 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.107 "name": "Existed_Raid", 00:15:11.107 "uuid": "5a5de65e-3efa-4cc4-b39e-87f5c69daf74", 00:15:11.107 "strip_size_kb": 64, 00:15:11.107 "state": "offline", 00:15:11.107 "raid_level": "concat", 00:15:11.107 "superblock": true, 00:15:11.107 "num_base_bdevs": 3, 00:15:11.107 "num_base_bdevs_discovered": 2, 00:15:11.107 "num_base_bdevs_operational": 2, 00:15:11.107 "base_bdevs_list": [ 00:15:11.107 { 00:15:11.107 "name": null, 00:15:11.107 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.107 "is_configured": false, 00:15:11.107 "data_offset": 2048, 00:15:11.107 "data_size": 63488 00:15:11.107 }, 00:15:11.107 { 00:15:11.107 "name": "BaseBdev2", 00:15:11.107 "uuid": "91a6f07a-01a8-4bc9-b1a2-a135c9346c73", 00:15:11.107 "is_configured": true, 00:15:11.107 "data_offset": 2048, 00:15:11.107 "data_size": 
63488 00:15:11.107 }, 00:15:11.107 { 00:15:11.107 "name": "BaseBdev3", 00:15:11.107 "uuid": "b1d89af2-e39f-4308-9913-c3b9091f29d8", 00:15:11.107 "is_configured": true, 00:15:11.107 "data_offset": 2048, 00:15:11.107 "data_size": 63488 00:15:11.107 } 00:15:11.107 ] 00:15:11.107 }' 00:15:11.107 18:18:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.107 18:18:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:11.673 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:11.673 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:11.673 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.673 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:11.931 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:11.931 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:11.931 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:12.191 [2024-07-12 18:18:55.738119] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:12.191 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:12.191 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:12.191 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:12.191 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:12.450 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:12.450 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:12.450 18:18:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:12.450 [2024-07-12 18:18:56.151695] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:12.450 [2024-07-12 18:18:56.151745] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf82400 name Existed_Raid, state offline 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:12.709 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:12.967 BaseBdev2 00:15:12.967 18:18:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:12.967 18:18:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:12.967 18:18:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:12.967 18:18:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:12.967 18:18:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:12.967 18:18:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:12.967 18:18:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:13.226 18:18:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:13.484 [ 00:15:13.484 { 00:15:13.484 "name": "BaseBdev2", 00:15:13.484 "aliases": [ 00:15:13.484 "d53536bc-40bd-4ce2-a720-0b8cb7186b82" 00:15:13.484 ], 00:15:13.484 "product_name": "Malloc disk", 00:15:13.484 "block_size": 512, 00:15:13.484 "num_blocks": 65536, 00:15:13.484 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:13.484 "assigned_rate_limits": { 00:15:13.484 "rw_ios_per_sec": 0, 00:15:13.484 "rw_mbytes_per_sec": 0, 00:15:13.484 "r_mbytes_per_sec": 0, 00:15:13.484 "w_mbytes_per_sec": 0 00:15:13.484 }, 00:15:13.484 "claimed": false, 00:15:13.484 "zoned": false, 00:15:13.484 "supported_io_types": { 00:15:13.484 "read": true, 00:15:13.484 "write": true, 00:15:13.484 "unmap": true, 00:15:13.484 "flush": 
true, 00:15:13.485 "reset": true, 00:15:13.485 "nvme_admin": false, 00:15:13.485 "nvme_io": false, 00:15:13.485 "nvme_io_md": false, 00:15:13.485 "write_zeroes": true, 00:15:13.485 "zcopy": true, 00:15:13.485 "get_zone_info": false, 00:15:13.485 "zone_management": false, 00:15:13.485 "zone_append": false, 00:15:13.485 "compare": false, 00:15:13.485 "compare_and_write": false, 00:15:13.485 "abort": true, 00:15:13.485 "seek_hole": false, 00:15:13.485 "seek_data": false, 00:15:13.485 "copy": true, 00:15:13.485 "nvme_iov_md": false 00:15:13.485 }, 00:15:13.485 "memory_domains": [ 00:15:13.485 { 00:15:13.485 "dma_device_id": "system", 00:15:13.485 "dma_device_type": 1 00:15:13.485 }, 00:15:13.485 { 00:15:13.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.485 "dma_device_type": 2 00:15:13.485 } 00:15:13.485 ], 00:15:13.485 "driver_specific": {} 00:15:13.485 } 00:15:13.485 ] 00:15:13.485 18:18:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:13.485 18:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:13.485 18:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:13.485 18:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:13.743 BaseBdev3 00:15:13.743 18:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:13.743 18:18:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:13.743 18:18:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:13.743 18:18:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:13.743 18:18:57 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:13.743 18:18:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:13.743 18:18:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:14.002 18:18:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:14.261 [ 00:15:14.261 { 00:15:14.261 "name": "BaseBdev3", 00:15:14.261 "aliases": [ 00:15:14.261 "610e8fa5-063b-4fdb-8974-729e625ddd98" 00:15:14.261 ], 00:15:14.261 "product_name": "Malloc disk", 00:15:14.261 "block_size": 512, 00:15:14.261 "num_blocks": 65536, 00:15:14.261 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:14.261 "assigned_rate_limits": { 00:15:14.261 "rw_ios_per_sec": 0, 00:15:14.261 "rw_mbytes_per_sec": 0, 00:15:14.261 "r_mbytes_per_sec": 0, 00:15:14.261 "w_mbytes_per_sec": 0 00:15:14.261 }, 00:15:14.261 "claimed": false, 00:15:14.261 "zoned": false, 00:15:14.261 "supported_io_types": { 00:15:14.261 "read": true, 00:15:14.261 "write": true, 00:15:14.261 "unmap": true, 00:15:14.261 "flush": true, 00:15:14.261 "reset": true, 00:15:14.261 "nvme_admin": false, 00:15:14.261 "nvme_io": false, 00:15:14.261 "nvme_io_md": false, 00:15:14.261 "write_zeroes": true, 00:15:14.261 "zcopy": true, 00:15:14.261 "get_zone_info": false, 00:15:14.261 "zone_management": false, 00:15:14.261 "zone_append": false, 00:15:14.261 "compare": false, 00:15:14.261 "compare_and_write": false, 00:15:14.261 "abort": true, 00:15:14.261 "seek_hole": false, 00:15:14.261 "seek_data": false, 00:15:14.261 "copy": true, 00:15:14.261 "nvme_iov_md": false 00:15:14.261 }, 00:15:14.261 "memory_domains": [ 00:15:14.261 { 00:15:14.261 "dma_device_id": "system", 00:15:14.261 "dma_device_type": 1 
00:15:14.261 }, 00:15:14.261 { 00:15:14.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.261 "dma_device_type": 2 00:15:14.261 } 00:15:14.261 ], 00:15:14.261 "driver_specific": {} 00:15:14.261 } 00:15:14.261 ] 00:15:14.261 18:18:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:14.261 18:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:14.261 18:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:14.261 18:18:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:14.519 [2024-07-12 18:18:58.071334] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:14.519 [2024-07-12 18:18:58.071381] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:14.519 [2024-07-12 18:18:58.071401] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:14.519 [2024-07-12 18:18:58.072809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.519 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.777 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.777 "name": "Existed_Raid", 00:15:14.777 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:14.777 "strip_size_kb": 64, 00:15:14.777 "state": "configuring", 00:15:14.777 "raid_level": "concat", 00:15:14.777 "superblock": true, 00:15:14.777 "num_base_bdevs": 3, 00:15:14.777 "num_base_bdevs_discovered": 2, 00:15:14.777 "num_base_bdevs_operational": 3, 00:15:14.777 "base_bdevs_list": [ 00:15:14.777 { 00:15:14.777 "name": "BaseBdev1", 00:15:14.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.777 "is_configured": false, 00:15:14.777 "data_offset": 0, 00:15:14.777 "data_size": 0 00:15:14.777 }, 00:15:14.777 { 00:15:14.777 "name": "BaseBdev2", 00:15:14.777 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:14.777 "is_configured": true, 00:15:14.777 "data_offset": 2048, 00:15:14.777 "data_size": 63488 00:15:14.777 }, 00:15:14.777 { 00:15:14.777 "name": "BaseBdev3", 00:15:14.777 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:14.777 "is_configured": true, 00:15:14.777 "data_offset": 2048, 00:15:14.777 
"data_size": 63488 00:15:14.777 } 00:15:14.777 ] 00:15:14.777 }' 00:15:14.777 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.777 18:18:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:15.342 18:18:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:15.599 [2024-07-12 18:18:59.069959] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
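After `bdev_raid_remove_base_bdev BaseBdev2`, the trace re-reads the array state and expects `num_base_bdevs_discovered` to drop to 1 while the state stays `configuring`. As a hedged offline sketch only (the real check is the jq filter at `bdev_raid.sh@201`, run against the live socket), the name-extraction logic can be emulated in Python over the `base_bdevs_list` reported in the log:

```python
import json

# base_bdevs_list as reported in the log once BaseBdev2 is removed:
# its slot keeps the uuid from the superblock but loses its name,
# leaving only BaseBdev3 configured.
base_bdevs_list = json.loads("""
[
  {"name": "BaseBdev1", "uuid": "00000000-0000-0000-0000-000000000000",
   "is_configured": false},
  {"name": null, "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82",
   "is_configured": false},
  {"name": "BaseBdev3", "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98",
   "is_configured": true}
]
""")

# Equivalent of the jq filter used at bdev_raid.sh@201:
#   .base_bdevs_list[] | select(.is_configured == true).name
configured = [b["name"] for b in base_bdevs_list if b["is_configured"]]

assert configured == ["BaseBdev3"]  # matches num_base_bdevs_discovered: 1
```

Because the array was created with `-s` (superblock), the removed slot is retained with a null name rather than dropped, which is why `num_base_bdevs` stays 3 while `num_base_bdevs_discovered` falls to 1.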
00:15:15.599 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:15.858 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.858 "name": "Existed_Raid", 00:15:15.858 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:15.858 "strip_size_kb": 64, 00:15:15.858 "state": "configuring", 00:15:15.858 "raid_level": "concat", 00:15:15.858 "superblock": true, 00:15:15.858 "num_base_bdevs": 3, 00:15:15.858 "num_base_bdevs_discovered": 1, 00:15:15.858 "num_base_bdevs_operational": 3, 00:15:15.858 "base_bdevs_list": [ 00:15:15.858 { 00:15:15.858 "name": "BaseBdev1", 00:15:15.858 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.858 "is_configured": false, 00:15:15.858 "data_offset": 0, 00:15:15.858 "data_size": 0 00:15:15.858 }, 00:15:15.858 { 00:15:15.858 "name": null, 00:15:15.858 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:15.858 "is_configured": false, 00:15:15.858 "data_offset": 2048, 00:15:15.858 "data_size": 63488 00:15:15.858 }, 00:15:15.858 { 00:15:15.858 "name": "BaseBdev3", 00:15:15.858 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:15.858 "is_configured": true, 00:15:15.858 "data_offset": 2048, 00:15:15.858 "data_size": 63488 00:15:15.858 } 00:15:15.858 ] 00:15:15.858 }' 00:15:15.858 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.858 18:18:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:16.425 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.425 18:18:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:16.683 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:15:16.683 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:16.683 [2024-07-12 18:19:00.406098] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:16.683 BaseBdev1 00:15:16.940 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:16.940 18:19:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:16.940 18:19:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:16.940 18:19:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:16.940 18:19:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:16.940 18:19:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:16.940 18:19:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:17.197 18:19:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:17.198 [ 00:15:17.198 { 00:15:17.198 "name": "BaseBdev1", 00:15:17.198 "aliases": [ 00:15:17.198 "bf3e864b-cc22-465a-9818-bf7c00944f08" 00:15:17.198 ], 00:15:17.198 "product_name": "Malloc disk", 00:15:17.198 "block_size": 512, 00:15:17.198 "num_blocks": 65536, 00:15:17.198 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:17.198 "assigned_rate_limits": { 00:15:17.198 "rw_ios_per_sec": 0, 00:15:17.198 "rw_mbytes_per_sec": 0, 00:15:17.198 "r_mbytes_per_sec": 0, 00:15:17.198 
"w_mbytes_per_sec": 0 00:15:17.198 }, 00:15:17.198 "claimed": true, 00:15:17.198 "claim_type": "exclusive_write", 00:15:17.198 "zoned": false, 00:15:17.198 "supported_io_types": { 00:15:17.198 "read": true, 00:15:17.198 "write": true, 00:15:17.198 "unmap": true, 00:15:17.198 "flush": true, 00:15:17.198 "reset": true, 00:15:17.198 "nvme_admin": false, 00:15:17.198 "nvme_io": false, 00:15:17.198 "nvme_io_md": false, 00:15:17.198 "write_zeroes": true, 00:15:17.198 "zcopy": true, 00:15:17.198 "get_zone_info": false, 00:15:17.198 "zone_management": false, 00:15:17.198 "zone_append": false, 00:15:17.198 "compare": false, 00:15:17.198 "compare_and_write": false, 00:15:17.198 "abort": true, 00:15:17.198 "seek_hole": false, 00:15:17.198 "seek_data": false, 00:15:17.198 "copy": true, 00:15:17.198 "nvme_iov_md": false 00:15:17.198 }, 00:15:17.198 "memory_domains": [ 00:15:17.198 { 00:15:17.198 "dma_device_id": "system", 00:15:17.198 "dma_device_type": 1 00:15:17.198 }, 00:15:17.198 { 00:15:17.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.198 "dma_device_type": 2 00:15:17.198 } 00:15:17.198 ], 00:15:17.198 "driver_specific": {} 00:15:17.198 } 00:15:17.198 ] 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.198 18:19:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.455 18:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.455 "name": "Existed_Raid", 00:15:17.455 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:17.455 "strip_size_kb": 64, 00:15:17.455 "state": "configuring", 00:15:17.455 "raid_level": "concat", 00:15:17.455 "superblock": true, 00:15:17.455 "num_base_bdevs": 3, 00:15:17.455 "num_base_bdevs_discovered": 2, 00:15:17.455 "num_base_bdevs_operational": 3, 00:15:17.455 "base_bdevs_list": [ 00:15:17.455 { 00:15:17.455 "name": "BaseBdev1", 00:15:17.455 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:17.455 "is_configured": true, 00:15:17.455 "data_offset": 2048, 00:15:17.455 "data_size": 63488 00:15:17.455 }, 00:15:17.455 { 00:15:17.455 "name": null, 00:15:17.455 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:17.455 "is_configured": false, 00:15:17.455 "data_offset": 2048, 00:15:17.455 "data_size": 63488 00:15:17.455 }, 00:15:17.455 { 00:15:17.455 "name": "BaseBdev3", 00:15:17.455 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:17.455 "is_configured": true, 00:15:17.455 "data_offset": 2048, 00:15:17.455 "data_size": 63488 00:15:17.455 } 
00:15:17.455 ] 00:15:17.455 }' 00:15:17.455 18:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.455 18:19:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:18.390 18:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.390 18:19:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:18.390 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:18.390 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:18.650 [2024-07-12 18:19:02.234981] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.650 
18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.650 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.908 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.908 "name": "Existed_Raid", 00:15:18.908 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:18.908 "strip_size_kb": 64, 00:15:18.908 "state": "configuring", 00:15:18.908 "raid_level": "concat", 00:15:18.908 "superblock": true, 00:15:18.908 "num_base_bdevs": 3, 00:15:18.908 "num_base_bdevs_discovered": 1, 00:15:18.908 "num_base_bdevs_operational": 3, 00:15:18.908 "base_bdevs_list": [ 00:15:18.908 { 00:15:18.908 "name": "BaseBdev1", 00:15:18.908 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:18.908 "is_configured": true, 00:15:18.908 "data_offset": 2048, 00:15:18.908 "data_size": 63488 00:15:18.908 }, 00:15:18.908 { 00:15:18.908 "name": null, 00:15:18.908 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:18.908 "is_configured": false, 00:15:18.908 "data_offset": 2048, 00:15:18.908 "data_size": 63488 00:15:18.908 }, 00:15:18.908 { 00:15:18.908 "name": null, 00:15:18.908 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:18.909 "is_configured": false, 00:15:18.909 "data_offset": 2048, 00:15:18.909 "data_size": 63488 00:15:18.909 } 00:15:18.909 ] 00:15:18.909 }' 00:15:18.909 18:19:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.909 18:19:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:19.538 18:19:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.538 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:19.822 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:19.822 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:20.081 [2024-07-12 18:19:03.558505] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.081 18:19:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.081 18:19:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.650 18:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.650 "name": "Existed_Raid", 00:15:20.650 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:20.650 "strip_size_kb": 64, 00:15:20.650 "state": "configuring", 00:15:20.650 "raid_level": "concat", 00:15:20.650 "superblock": true, 00:15:20.650 "num_base_bdevs": 3, 00:15:20.650 "num_base_bdevs_discovered": 2, 00:15:20.650 "num_base_bdevs_operational": 3, 00:15:20.650 "base_bdevs_list": [ 00:15:20.650 { 00:15:20.650 "name": "BaseBdev1", 00:15:20.650 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:20.650 "is_configured": true, 00:15:20.650 "data_offset": 2048, 00:15:20.650 "data_size": 63488 00:15:20.650 }, 00:15:20.650 { 00:15:20.650 "name": null, 00:15:20.650 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:20.650 "is_configured": false, 00:15:20.650 "data_offset": 2048, 00:15:20.650 "data_size": 63488 00:15:20.650 }, 00:15:20.650 { 00:15:20.650 "name": "BaseBdev3", 00:15:20.650 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:20.650 "is_configured": true, 00:15:20.650 "data_offset": 2048, 00:15:20.650 "data_size": 63488 00:15:20.650 } 00:15:20.650 ] 00:15:20.650 }' 00:15:20.650 18:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.650 18:19:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:21.218 18:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.219 18:19:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:21.219 18:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:21.219 18:19:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:21.478 [2024-07-12 18:19:05.102618] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.478 18:19:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.738 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.738 "name": "Existed_Raid", 00:15:21.738 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:21.738 "strip_size_kb": 64, 00:15:21.738 "state": "configuring", 00:15:21.738 "raid_level": "concat", 00:15:21.738 "superblock": true, 00:15:21.738 "num_base_bdevs": 3, 00:15:21.738 "num_base_bdevs_discovered": 1, 00:15:21.738 "num_base_bdevs_operational": 3, 00:15:21.738 "base_bdevs_list": [ 00:15:21.738 { 00:15:21.738 "name": null, 00:15:21.738 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:21.738 "is_configured": false, 00:15:21.738 "data_offset": 2048, 00:15:21.738 "data_size": 63488 00:15:21.738 }, 00:15:21.738 { 00:15:21.738 "name": null, 00:15:21.738 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:21.738 "is_configured": false, 00:15:21.738 "data_offset": 2048, 00:15:21.738 "data_size": 63488 00:15:21.738 }, 00:15:21.738 { 00:15:21.738 "name": "BaseBdev3", 00:15:21.738 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:21.738 "is_configured": true, 00:15:21.738 "data_offset": 2048, 00:15:21.738 "data_size": 63488 00:15:21.738 } 00:15:21.738 ] 00:15:21.738 }' 00:15:21.738 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.738 18:19:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.305 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.305 18:19:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:22.564 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:22.564 18:19:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:22.823 [2024-07-12 18:19:06.390478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.823 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.082 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.082 "name": 
"Existed_Raid", 00:15:23.082 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:23.082 "strip_size_kb": 64, 00:15:23.082 "state": "configuring", 00:15:23.082 "raid_level": "concat", 00:15:23.082 "superblock": true, 00:15:23.082 "num_base_bdevs": 3, 00:15:23.082 "num_base_bdevs_discovered": 2, 00:15:23.082 "num_base_bdevs_operational": 3, 00:15:23.082 "base_bdevs_list": [ 00:15:23.082 { 00:15:23.082 "name": null, 00:15:23.082 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:23.082 "is_configured": false, 00:15:23.082 "data_offset": 2048, 00:15:23.082 "data_size": 63488 00:15:23.082 }, 00:15:23.082 { 00:15:23.082 "name": "BaseBdev2", 00:15:23.082 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:23.082 "is_configured": true, 00:15:23.082 "data_offset": 2048, 00:15:23.082 "data_size": 63488 00:15:23.082 }, 00:15:23.082 { 00:15:23.082 "name": "BaseBdev3", 00:15:23.082 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:23.082 "is_configured": true, 00:15:23.082 "data_offset": 2048, 00:15:23.082 "data_size": 63488 00:15:23.082 } 00:15:23.082 ] 00:15:23.082 }' 00:15:23.082 18:19:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.082 18:19:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:23.649 18:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.649 18:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:23.908 18:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:23.908 18:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.908 18:19:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:24.168 18:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u bf3e864b-cc22-465a-9818-bf7c00944f08 00:15:24.427 [2024-07-12 18:19:07.970045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:24.427 [2024-07-12 18:19:07.970203] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf80f50 00:15:24.427 [2024-07-12 18:19:07.970216] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:24.427 [2024-07-12 18:19:07.970389] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc87940 00:15:24.427 [2024-07-12 18:19:07.970499] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf80f50 00:15:24.427 [2024-07-12 18:19:07.970509] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf80f50 00:15:24.427 [2024-07-12 18:19:07.970599] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:24.427 NewBaseBdev 00:15:24.427 18:19:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:24.427 18:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:24.427 18:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:24.427 18:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:24.427 18:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:24.427 18:19:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:24.427 18:19:07 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:24.687 18:19:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:24.947 [ 00:15:24.947 { 00:15:24.947 "name": "NewBaseBdev", 00:15:24.947 "aliases": [ 00:15:24.947 "bf3e864b-cc22-465a-9818-bf7c00944f08" 00:15:24.947 ], 00:15:24.947 "product_name": "Malloc disk", 00:15:24.947 "block_size": 512, 00:15:24.947 "num_blocks": 65536, 00:15:24.947 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:24.947 "assigned_rate_limits": { 00:15:24.947 "rw_ios_per_sec": 0, 00:15:24.947 "rw_mbytes_per_sec": 0, 00:15:24.947 "r_mbytes_per_sec": 0, 00:15:24.947 "w_mbytes_per_sec": 0 00:15:24.947 }, 00:15:24.947 "claimed": true, 00:15:24.947 "claim_type": "exclusive_write", 00:15:24.947 "zoned": false, 00:15:24.947 "supported_io_types": { 00:15:24.947 "read": true, 00:15:24.947 "write": true, 00:15:24.947 "unmap": true, 00:15:24.947 "flush": true, 00:15:24.947 "reset": true, 00:15:24.947 "nvme_admin": false, 00:15:24.947 "nvme_io": false, 00:15:24.947 "nvme_io_md": false, 00:15:24.947 "write_zeroes": true, 00:15:24.947 "zcopy": true, 00:15:24.947 "get_zone_info": false, 00:15:24.947 "zone_management": false, 00:15:24.947 "zone_append": false, 00:15:24.947 "compare": false, 00:15:24.947 "compare_and_write": false, 00:15:24.947 "abort": true, 00:15:24.947 "seek_hole": false, 00:15:24.947 "seek_data": false, 00:15:24.947 "copy": true, 00:15:24.947 "nvme_iov_md": false 00:15:24.947 }, 00:15:24.947 "memory_domains": [ 00:15:24.947 { 00:15:24.947 "dma_device_id": "system", 00:15:24.947 "dma_device_type": 1 00:15:24.947 }, 00:15:24.947 { 00:15:24.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.947 "dma_device_type": 2 00:15:24.947 } 
00:15:24.947 ], 00:15:24.947 "driver_specific": {} 00:15:24.947 } 00:15:24.947 ] 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.947 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.207 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.207 "name": "Existed_Raid", 00:15:25.207 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:25.207 "strip_size_kb": 64, 00:15:25.207 "state": "online", 00:15:25.207 
"raid_level": "concat", 00:15:25.207 "superblock": true, 00:15:25.207 "num_base_bdevs": 3, 00:15:25.207 "num_base_bdevs_discovered": 3, 00:15:25.207 "num_base_bdevs_operational": 3, 00:15:25.207 "base_bdevs_list": [ 00:15:25.207 { 00:15:25.207 "name": "NewBaseBdev", 00:15:25.207 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:25.207 "is_configured": true, 00:15:25.207 "data_offset": 2048, 00:15:25.207 "data_size": 63488 00:15:25.207 }, 00:15:25.207 { 00:15:25.207 "name": "BaseBdev2", 00:15:25.207 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:25.207 "is_configured": true, 00:15:25.207 "data_offset": 2048, 00:15:25.207 "data_size": 63488 00:15:25.207 }, 00:15:25.207 { 00:15:25.207 "name": "BaseBdev3", 00:15:25.207 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:25.207 "is_configured": true, 00:15:25.207 "data_offset": 2048, 00:15:25.207 "data_size": 63488 00:15:25.207 } 00:15:25.207 ] 00:15:25.207 }' 00:15:25.207 18:19:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.207 18:19:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:25.775 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:25.775 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:25.775 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:25.775 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:25.775 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:25.775 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:25.775 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:25.775 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:26.034 [2024-07-12 18:19:09.530505] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:26.034 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:26.034 "name": "Existed_Raid", 00:15:26.034 "aliases": [ 00:15:26.034 "3fa0e84c-667f-459b-9ca5-2ee7d5271954" 00:15:26.034 ], 00:15:26.034 "product_name": "Raid Volume", 00:15:26.034 "block_size": 512, 00:15:26.034 "num_blocks": 190464, 00:15:26.034 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:26.034 "assigned_rate_limits": { 00:15:26.034 "rw_ios_per_sec": 0, 00:15:26.034 "rw_mbytes_per_sec": 0, 00:15:26.034 "r_mbytes_per_sec": 0, 00:15:26.034 "w_mbytes_per_sec": 0 00:15:26.034 }, 00:15:26.034 "claimed": false, 00:15:26.034 "zoned": false, 00:15:26.034 "supported_io_types": { 00:15:26.034 "read": true, 00:15:26.034 "write": true, 00:15:26.034 "unmap": true, 00:15:26.034 "flush": true, 00:15:26.034 "reset": true, 00:15:26.034 "nvme_admin": false, 00:15:26.034 "nvme_io": false, 00:15:26.034 "nvme_io_md": false, 00:15:26.034 "write_zeroes": true, 00:15:26.034 "zcopy": false, 00:15:26.034 "get_zone_info": false, 00:15:26.034 "zone_management": false, 00:15:26.034 "zone_append": false, 00:15:26.034 "compare": false, 00:15:26.034 "compare_and_write": false, 00:15:26.034 "abort": false, 00:15:26.034 "seek_hole": false, 00:15:26.034 "seek_data": false, 00:15:26.034 "copy": false, 00:15:26.034 "nvme_iov_md": false 00:15:26.034 }, 00:15:26.034 "memory_domains": [ 00:15:26.034 { 00:15:26.034 "dma_device_id": "system", 00:15:26.034 "dma_device_type": 1 00:15:26.034 }, 00:15:26.034 { 00:15:26.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.034 "dma_device_type": 2 00:15:26.034 }, 00:15:26.034 { 00:15:26.034 "dma_device_id": "system", 00:15:26.034 "dma_device_type": 1 00:15:26.034 
}, 00:15:26.034 { 00:15:26.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.034 "dma_device_type": 2 00:15:26.034 }, 00:15:26.034 { 00:15:26.034 "dma_device_id": "system", 00:15:26.034 "dma_device_type": 1 00:15:26.034 }, 00:15:26.034 { 00:15:26.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.034 "dma_device_type": 2 00:15:26.034 } 00:15:26.034 ], 00:15:26.034 "driver_specific": { 00:15:26.034 "raid": { 00:15:26.034 "uuid": "3fa0e84c-667f-459b-9ca5-2ee7d5271954", 00:15:26.034 "strip_size_kb": 64, 00:15:26.034 "state": "online", 00:15:26.034 "raid_level": "concat", 00:15:26.034 "superblock": true, 00:15:26.034 "num_base_bdevs": 3, 00:15:26.034 "num_base_bdevs_discovered": 3, 00:15:26.034 "num_base_bdevs_operational": 3, 00:15:26.034 "base_bdevs_list": [ 00:15:26.034 { 00:15:26.034 "name": "NewBaseBdev", 00:15:26.034 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:26.034 "is_configured": true, 00:15:26.034 "data_offset": 2048, 00:15:26.034 "data_size": 63488 00:15:26.034 }, 00:15:26.034 { 00:15:26.034 "name": "BaseBdev2", 00:15:26.034 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:26.034 "is_configured": true, 00:15:26.034 "data_offset": 2048, 00:15:26.034 "data_size": 63488 00:15:26.034 }, 00:15:26.034 { 00:15:26.034 "name": "BaseBdev3", 00:15:26.034 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:26.034 "is_configured": true, 00:15:26.034 "data_offset": 2048, 00:15:26.034 "data_size": 63488 00:15:26.034 } 00:15:26.034 ] 00:15:26.034 } 00:15:26.034 } 00:15:26.034 }' 00:15:26.034 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:26.034 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:26.034 BaseBdev2 00:15:26.034 BaseBdev3' 00:15:26.034 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.034 
18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:26.034 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.293 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.293 "name": "NewBaseBdev", 00:15:26.293 "aliases": [ 00:15:26.293 "bf3e864b-cc22-465a-9818-bf7c00944f08" 00:15:26.293 ], 00:15:26.293 "product_name": "Malloc disk", 00:15:26.293 "block_size": 512, 00:15:26.293 "num_blocks": 65536, 00:15:26.293 "uuid": "bf3e864b-cc22-465a-9818-bf7c00944f08", 00:15:26.293 "assigned_rate_limits": { 00:15:26.293 "rw_ios_per_sec": 0, 00:15:26.293 "rw_mbytes_per_sec": 0, 00:15:26.293 "r_mbytes_per_sec": 0, 00:15:26.293 "w_mbytes_per_sec": 0 00:15:26.293 }, 00:15:26.293 "claimed": true, 00:15:26.293 "claim_type": "exclusive_write", 00:15:26.293 "zoned": false, 00:15:26.293 "supported_io_types": { 00:15:26.293 "read": true, 00:15:26.293 "write": true, 00:15:26.293 "unmap": true, 00:15:26.293 "flush": true, 00:15:26.293 "reset": true, 00:15:26.293 "nvme_admin": false, 00:15:26.293 "nvme_io": false, 00:15:26.293 "nvme_io_md": false, 00:15:26.293 "write_zeroes": true, 00:15:26.293 "zcopy": true, 00:15:26.293 "get_zone_info": false, 00:15:26.293 "zone_management": false, 00:15:26.293 "zone_append": false, 00:15:26.293 "compare": false, 00:15:26.293 "compare_and_write": false, 00:15:26.293 "abort": true, 00:15:26.293 "seek_hole": false, 00:15:26.293 "seek_data": false, 00:15:26.293 "copy": true, 00:15:26.293 "nvme_iov_md": false 00:15:26.293 }, 00:15:26.293 "memory_domains": [ 00:15:26.293 { 00:15:26.293 "dma_device_id": "system", 00:15:26.293 "dma_device_type": 1 00:15:26.293 }, 00:15:26.293 { 00:15:26.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.293 "dma_device_type": 2 00:15:26.293 } 00:15:26.293 ], 00:15:26.293 
"driver_specific": {} 00:15:26.293 }' 00:15:26.293 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.293 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.293 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.293 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.293 18:19:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.293 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.293 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.551 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.551 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.551 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.551 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.551 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.551 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.551 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.551 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:26.809 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.809 "name": "BaseBdev2", 00:15:26.809 "aliases": [ 00:15:26.809 "d53536bc-40bd-4ce2-a720-0b8cb7186b82" 00:15:26.809 ], 00:15:26.809 "product_name": 
"Malloc disk", 00:15:26.809 "block_size": 512, 00:15:26.809 "num_blocks": 65536, 00:15:26.809 "uuid": "d53536bc-40bd-4ce2-a720-0b8cb7186b82", 00:15:26.809 "assigned_rate_limits": { 00:15:26.809 "rw_ios_per_sec": 0, 00:15:26.809 "rw_mbytes_per_sec": 0, 00:15:26.809 "r_mbytes_per_sec": 0, 00:15:26.809 "w_mbytes_per_sec": 0 00:15:26.809 }, 00:15:26.809 "claimed": true, 00:15:26.809 "claim_type": "exclusive_write", 00:15:26.809 "zoned": false, 00:15:26.809 "supported_io_types": { 00:15:26.809 "read": true, 00:15:26.809 "write": true, 00:15:26.809 "unmap": true, 00:15:26.809 "flush": true, 00:15:26.809 "reset": true, 00:15:26.809 "nvme_admin": false, 00:15:26.809 "nvme_io": false, 00:15:26.809 "nvme_io_md": false, 00:15:26.809 "write_zeroes": true, 00:15:26.809 "zcopy": true, 00:15:26.809 "get_zone_info": false, 00:15:26.809 "zone_management": false, 00:15:26.809 "zone_append": false, 00:15:26.809 "compare": false, 00:15:26.809 "compare_and_write": false, 00:15:26.809 "abort": true, 00:15:26.809 "seek_hole": false, 00:15:26.809 "seek_data": false, 00:15:26.809 "copy": true, 00:15:26.809 "nvme_iov_md": false 00:15:26.809 }, 00:15:26.809 "memory_domains": [ 00:15:26.809 { 00:15:26.809 "dma_device_id": "system", 00:15:26.809 "dma_device_type": 1 00:15:26.809 }, 00:15:26.809 { 00:15:26.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.809 "dma_device_type": 2 00:15:26.809 } 00:15:26.809 ], 00:15:26.809 "driver_specific": {} 00:15:26.809 }' 00:15:26.809 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.809 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.809 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.809 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.067 
18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:27.067 18:19:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:27.325 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:27.325 "name": "BaseBdev3", 00:15:27.325 "aliases": [ 00:15:27.325 "610e8fa5-063b-4fdb-8974-729e625ddd98" 00:15:27.325 ], 00:15:27.325 "product_name": "Malloc disk", 00:15:27.325 "block_size": 512, 00:15:27.325 "num_blocks": 65536, 00:15:27.325 "uuid": "610e8fa5-063b-4fdb-8974-729e625ddd98", 00:15:27.325 "assigned_rate_limits": { 00:15:27.325 "rw_ios_per_sec": 0, 00:15:27.325 "rw_mbytes_per_sec": 0, 00:15:27.325 "r_mbytes_per_sec": 0, 00:15:27.325 "w_mbytes_per_sec": 0 00:15:27.325 }, 00:15:27.325 "claimed": true, 00:15:27.325 "claim_type": "exclusive_write", 00:15:27.325 "zoned": false, 00:15:27.325 "supported_io_types": { 00:15:27.325 "read": true, 00:15:27.325 "write": true, 00:15:27.325 "unmap": true, 
00:15:27.325 "flush": true, 00:15:27.325 "reset": true, 00:15:27.325 "nvme_admin": false, 00:15:27.325 "nvme_io": false, 00:15:27.325 "nvme_io_md": false, 00:15:27.325 "write_zeroes": true, 00:15:27.325 "zcopy": true, 00:15:27.325 "get_zone_info": false, 00:15:27.325 "zone_management": false, 00:15:27.325 "zone_append": false, 00:15:27.325 "compare": false, 00:15:27.325 "compare_and_write": false, 00:15:27.325 "abort": true, 00:15:27.325 "seek_hole": false, 00:15:27.325 "seek_data": false, 00:15:27.325 "copy": true, 00:15:27.325 "nvme_iov_md": false 00:15:27.325 }, 00:15:27.325 "memory_domains": [ 00:15:27.325 { 00:15:27.325 "dma_device_id": "system", 00:15:27.325 "dma_device_type": 1 00:15:27.325 }, 00:15:27.325 { 00:15:27.325 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.325 "dma_device_type": 2 00:15:27.325 } 00:15:27.325 ], 00:15:27.325 "driver_specific": {} 00:15:27.325 }' 00:15:27.325 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.584 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.584 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:27.584 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.584 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.584 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:27.584 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.584 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.584 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.584 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.842 18:19:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.842 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.842 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:28.102 [2024-07-12 18:19:11.583669] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:28.102 [2024-07-12 18:19:11.583700] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:28.102 [2024-07-12 18:19:11.583752] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:28.102 [2024-07-12 18:19:11.583799] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:28.102 [2024-07-12 18:19:11.583811] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf80f50 name Existed_Raid, state offline 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2495649 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2495649 ']' 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2495649 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2495649 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2495649' 00:15:28.102 killing process with pid 2495649 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2495649 00:15:28.102 [2024-07-12 18:19:11.649013] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:28.102 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2495649 00:15:28.102 [2024-07-12 18:19:11.679601] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:28.362 18:19:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:28.362 00:15:28.362 real 0m28.324s 00:15:28.362 user 0m51.953s 00:15:28.362 sys 0m5.070s 00:15:28.362 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:28.362 18:19:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:28.362 ************************************ 00:15:28.362 END TEST raid_state_function_test_sb 00:15:28.362 ************************************ 00:15:28.362 18:19:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:28.362 18:19:11 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:15:28.362 18:19:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:28.362 18:19:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:28.362 18:19:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:28.362 ************************************ 00:15:28.362 START TEST raid_superblock_test 00:15:28.362 ************************************ 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
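The killprocess sequence above (autotest_common.sh@948-@972) guards against an empty pid, probes the process with `kill -0`, inspects its name via `ps`, then kills and waits on it. A minimal, hedged re-sketch of that pattern (simplified: no sudo special-casing and no per-OS `ps` handling, both of which the real helper has):

```shell
# Minimal sketch of the killprocess helper pattern from autotest_common.sh;
# the real helper also checks the process name and handles sudo-owned pids.
killprocess() {
  local pid=$1
  [ -n "$pid" ] || return 1                # '[' -z ... ']' guard from @948
  kill -0 "$pid" 2>/dev/null || return 1   # kill -0: is the process alive?
  echo "killing process with pid $pid"
  kill "$pid"
  wait "$pid" 2>/dev/null                  # reap it so the pid is fully gone
  return 0
}
sleep 30 & pid=$!
killprocess "$pid"
kill -0 "$pid" 2>/dev/null && echo "still alive" || echo "gone"
```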
local raid_level=concat 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2499945 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2499945 /var/tmp/spdk-raid.sock 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2499945 ']' 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:28.362 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:28.362 18:19:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.362 [2024-07-12 18:19:12.040755] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:15:28.362 [2024-07-12 18:19:12.040819] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2499945 ] 00:15:28.622 [2024-07-12 18:19:12.168346] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:28.622 [2024-07-12 18:19:12.275293] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:28.622 [2024-07-12 18:19:12.343335] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:28.622 [2024-07-12 18:19:12.343372] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:29.556 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:29.813 malloc1 00:15:29.814 18:19:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:30.381 [2024-07-12 18:19:13.976629] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:30.381 [2024-07-12 18:19:13.976676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:30.381 [2024-07-12 18:19:13.976697] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1885570 00:15:30.381 [2024-07-12 18:19:13.976709] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:30.381 [2024-07-12 18:19:13.978484] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:30.381 [2024-07-12 18:19:13.978512] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:30.381 pt1 00:15:30.381 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:30.381 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:30.381 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:30.381 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:30.381 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:30.381 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:30.381 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:30.381 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:30.381 18:19:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:30.640 malloc2 00:15:30.640 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:31.207 [2024-07-12 18:19:14.739418] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:31.207 [2024-07-12 18:19:14.739463] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:31.207 [2024-07-12 18:19:14.739481] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1886970 00:15:31.207 [2024-07-12 18:19:14.739493] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:31.207 [2024-07-12 18:19:14.741190] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:31.207 [2024-07-12 18:19:14.741218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:31.207 pt2 00:15:31.207 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:31.207 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:31.207 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:31.207 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:31.207 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:31.207 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:31.207 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:31.207 18:19:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:31.207 18:19:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:31.466 malloc3 00:15:31.466 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:32.034 [2024-07-12 18:19:15.503232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:32.034 [2024-07-12 18:19:15.503284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:32.034 [2024-07-12 18:19:15.503302] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a1d340 00:15:32.034 [2024-07-12 18:19:15.503315] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:32.034 [2024-07-12 18:19:15.504993] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:32.034 [2024-07-12 18:19:15.505019] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:32.034 pt3 00:15:32.034 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:32.034 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:32.034 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:32.035 [2024-07-12 18:19:15.755915] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:32.035 [2024-07-12 18:19:15.757258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
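By this point the @415-@425 loop has run three times: each pass names a malloc bdev and a passthru bdev, appends them (plus a fixed UUID) to the arrays declared at @394-@396, and creates them over RPC. A hedged sketch of just the array bookkeeping (the `bdev_malloc_create`/`bdev_passthru_create` RPC calls are omitted since they need a live SPDK socket):

```shell
# Rebuild of the bdev_raid.sh@415-@422 bookkeeping; the rpc.py calls that
# follow each iteration in the real test are left out.
num_base_bdevs=3
base_bdevs_malloc=() base_bdevs_pt=() base_bdevs_pt_uuid=()
for ((i = 1; i <= num_base_bdevs; i++)); do
  bdev_malloc="malloc$i"
  bdev_pt="pt$i"
  bdev_pt_uuid="00000000-0000-0000-0000-00000000000$i"
  base_bdevs_malloc+=("$bdev_malloc")
  base_bdevs_pt+=("$bdev_pt")
  base_bdevs_pt_uuid+=("$bdev_pt_uuid")
done
echo "${base_bdevs_pt[*]}"   # pt1 pt2 pt3
```

The echoed list matches the base bdevs handed to `bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s` at @429 above.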
00:15:32.035 [2024-07-12 18:19:15.757313] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:32.035 [2024-07-12 18:19:15.757462] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x187dea0 00:15:32.035 [2024-07-12 18:19:15.757474] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:32.035 [2024-07-12 18:19:15.757672] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1885240 00:15:32.035 [2024-07-12 18:19:15.757812] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x187dea0 00:15:32.035 [2024-07-12 18:19:15.757823] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x187dea0 00:15:32.035 [2024-07-12 18:19:15.757918] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.294 
18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.294 18:19:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:32.294 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.294 "name": "raid_bdev1", 00:15:32.294 "uuid": "4d8afa8a-7615-457e-a6b5-4acfbd17e608", 00:15:32.294 "strip_size_kb": 64, 00:15:32.294 "state": "online", 00:15:32.294 "raid_level": "concat", 00:15:32.294 "superblock": true, 00:15:32.294 "num_base_bdevs": 3, 00:15:32.294 "num_base_bdevs_discovered": 3, 00:15:32.295 "num_base_bdevs_operational": 3, 00:15:32.295 "base_bdevs_list": [ 00:15:32.295 { 00:15:32.295 "name": "pt1", 00:15:32.295 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:32.295 "is_configured": true, 00:15:32.295 "data_offset": 2048, 00:15:32.295 "data_size": 63488 00:15:32.295 }, 00:15:32.295 { 00:15:32.295 "name": "pt2", 00:15:32.295 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:32.295 "is_configured": true, 00:15:32.295 "data_offset": 2048, 00:15:32.295 "data_size": 63488 00:15:32.295 }, 00:15:32.295 { 00:15:32.295 "name": "pt3", 00:15:32.295 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:32.295 "is_configured": true, 00:15:32.295 "data_offset": 2048, 00:15:32.295 "data_size": 63488 00:15:32.295 } 00:15:32.295 ] 00:15:32.295 }' 00:15:32.295 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.295 18:19:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:32.862 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:32.862 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:32.862 18:19:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:32.862 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:32.862 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:32.862 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:33.122 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:33.122 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:33.122 [2024-07-12 18:19:16.814974] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:33.122 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:33.122 "name": "raid_bdev1", 00:15:33.122 "aliases": [ 00:15:33.122 "4d8afa8a-7615-457e-a6b5-4acfbd17e608" 00:15:33.122 ], 00:15:33.122 "product_name": "Raid Volume", 00:15:33.122 "block_size": 512, 00:15:33.122 "num_blocks": 190464, 00:15:33.122 "uuid": "4d8afa8a-7615-457e-a6b5-4acfbd17e608", 00:15:33.122 "assigned_rate_limits": { 00:15:33.122 "rw_ios_per_sec": 0, 00:15:33.122 "rw_mbytes_per_sec": 0, 00:15:33.122 "r_mbytes_per_sec": 0, 00:15:33.122 "w_mbytes_per_sec": 0 00:15:33.122 }, 00:15:33.122 "claimed": false, 00:15:33.122 "zoned": false, 00:15:33.122 "supported_io_types": { 00:15:33.122 "read": true, 00:15:33.122 "write": true, 00:15:33.122 "unmap": true, 00:15:33.122 "flush": true, 00:15:33.122 "reset": true, 00:15:33.122 "nvme_admin": false, 00:15:33.122 "nvme_io": false, 00:15:33.122 "nvme_io_md": false, 00:15:33.122 "write_zeroes": true, 00:15:33.122 "zcopy": false, 00:15:33.122 "get_zone_info": false, 00:15:33.122 "zone_management": false, 00:15:33.122 "zone_append": false, 00:15:33.122 "compare": false, 00:15:33.122 "compare_and_write": false, 00:15:33.122 "abort": false, 00:15:33.122 
"seek_hole": false, 00:15:33.122 "seek_data": false, 00:15:33.122 "copy": false, 00:15:33.122 "nvme_iov_md": false 00:15:33.122 }, 00:15:33.122 "memory_domains": [ 00:15:33.122 { 00:15:33.122 "dma_device_id": "system", 00:15:33.122 "dma_device_type": 1 00:15:33.122 }, 00:15:33.122 { 00:15:33.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.122 "dma_device_type": 2 00:15:33.122 }, 00:15:33.122 { 00:15:33.122 "dma_device_id": "system", 00:15:33.122 "dma_device_type": 1 00:15:33.122 }, 00:15:33.122 { 00:15:33.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.122 "dma_device_type": 2 00:15:33.122 }, 00:15:33.122 { 00:15:33.122 "dma_device_id": "system", 00:15:33.122 "dma_device_type": 1 00:15:33.122 }, 00:15:33.122 { 00:15:33.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.122 "dma_device_type": 2 00:15:33.122 } 00:15:33.122 ], 00:15:33.122 "driver_specific": { 00:15:33.122 "raid": { 00:15:33.122 "uuid": "4d8afa8a-7615-457e-a6b5-4acfbd17e608", 00:15:33.122 "strip_size_kb": 64, 00:15:33.122 "state": "online", 00:15:33.122 "raid_level": "concat", 00:15:33.122 "superblock": true, 00:15:33.122 "num_base_bdevs": 3, 00:15:33.122 "num_base_bdevs_discovered": 3, 00:15:33.122 "num_base_bdevs_operational": 3, 00:15:33.122 "base_bdevs_list": [ 00:15:33.122 { 00:15:33.122 "name": "pt1", 00:15:33.122 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:33.122 "is_configured": true, 00:15:33.122 "data_offset": 2048, 00:15:33.122 "data_size": 63488 00:15:33.122 }, 00:15:33.122 { 00:15:33.122 "name": "pt2", 00:15:33.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:33.122 "is_configured": true, 00:15:33.122 "data_offset": 2048, 00:15:33.122 "data_size": 63488 00:15:33.122 }, 00:15:33.122 { 00:15:33.122 "name": "pt3", 00:15:33.122 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:33.122 "is_configured": true, 00:15:33.122 "data_offset": 2048, 00:15:33.122 "data_size": 63488 00:15:33.122 } 00:15:33.122 ] 00:15:33.122 } 00:15:33.122 } 00:15:33.122 }' 
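The dump above is what `verify_raid_bdev_state raid_bdev1 online concat 64 3` (invoked at @430) compares against its arguments. The comparison logic itself is not shown in this transcript, so the sketch below is a hedged guess at its shape: the three hard-coded locals stand in for values the real helper would extract from the `bdev_raid_get_bdevs` dump with jq, not a live response.

```shell
# Hypothetical sketch of the verify_raid_bdev_state checks (bdev_raid.sh@116+).
# state/raid_level/strip_size_kb below are stand-ins for jq-extracted fields
# (.state, .raid_level, .strip_size_kb) of the raid_bdev_info dump.
verify_raid_bdev_state() {
  local expected_state=$1 expected_level=$2 expected_strip=$3
  local state="online" raid_level="concat" strip_size_kb=64
  [[ $state == "$expected_state" ]] || return 1
  [[ $raid_level == "$expected_level" ]] || return 1
  (( strip_size_kb == expected_strip )) || return 1
}
verify_raid_bdev_state online concat 64 && echo "raid_bdev1 state verified"
```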
00:15:33.122 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:33.381 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:33.381 pt2 00:15:33.381 pt3' 00:15:33.381 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:33.381 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:33.381 18:19:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:33.640 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.640 "name": "pt1", 00:15:33.640 "aliases": [ 00:15:33.640 "00000000-0000-0000-0000-000000000001" 00:15:33.640 ], 00:15:33.640 "product_name": "passthru", 00:15:33.640 "block_size": 512, 00:15:33.640 "num_blocks": 65536, 00:15:33.640 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:33.640 "assigned_rate_limits": { 00:15:33.640 "rw_ios_per_sec": 0, 00:15:33.640 "rw_mbytes_per_sec": 0, 00:15:33.640 "r_mbytes_per_sec": 0, 00:15:33.640 "w_mbytes_per_sec": 0 00:15:33.640 }, 00:15:33.640 "claimed": true, 00:15:33.640 "claim_type": "exclusive_write", 00:15:33.640 "zoned": false, 00:15:33.640 "supported_io_types": { 00:15:33.640 "read": true, 00:15:33.640 "write": true, 00:15:33.640 "unmap": true, 00:15:33.640 "flush": true, 00:15:33.640 "reset": true, 00:15:33.640 "nvme_admin": false, 00:15:33.640 "nvme_io": false, 00:15:33.640 "nvme_io_md": false, 00:15:33.640 "write_zeroes": true, 00:15:33.640 "zcopy": true, 00:15:33.640 "get_zone_info": false, 00:15:33.640 "zone_management": false, 00:15:33.640 "zone_append": false, 00:15:33.640 "compare": false, 00:15:33.640 "compare_and_write": false, 00:15:33.640 "abort": true, 00:15:33.640 "seek_hole": false, 00:15:33.640 
"seek_data": false, 00:15:33.640 "copy": true, 00:15:33.640 "nvme_iov_md": false 00:15:33.640 }, 00:15:33.640 "memory_domains": [ 00:15:33.640 { 00:15:33.640 "dma_device_id": "system", 00:15:33.640 "dma_device_type": 1 00:15:33.640 }, 00:15:33.640 { 00:15:33.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.640 "dma_device_type": 2 00:15:33.640 } 00:15:33.640 ], 00:15:33.640 "driver_specific": { 00:15:33.640 "passthru": { 00:15:33.640 "name": "pt1", 00:15:33.640 "base_bdev_name": "malloc1" 00:15:33.640 } 00:15:33.640 } 00:15:33.640 }' 00:15:33.640 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.640 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.640 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:33.640 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.640 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.640 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:33.640 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.640 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.899 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.899 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.899 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.899 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.899 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:33.899 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:33.899 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:34.158 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:34.158 "name": "pt2", 00:15:34.158 "aliases": [ 00:15:34.158 "00000000-0000-0000-0000-000000000002" 00:15:34.158 ], 00:15:34.158 "product_name": "passthru", 00:15:34.158 "block_size": 512, 00:15:34.158 "num_blocks": 65536, 00:15:34.158 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:34.158 "assigned_rate_limits": { 00:15:34.158 "rw_ios_per_sec": 0, 00:15:34.158 "rw_mbytes_per_sec": 0, 00:15:34.158 "r_mbytes_per_sec": 0, 00:15:34.158 "w_mbytes_per_sec": 0 00:15:34.158 }, 00:15:34.158 "claimed": true, 00:15:34.158 "claim_type": "exclusive_write", 00:15:34.158 "zoned": false, 00:15:34.158 "supported_io_types": { 00:15:34.158 "read": true, 00:15:34.158 "write": true, 00:15:34.158 "unmap": true, 00:15:34.158 "flush": true, 00:15:34.158 "reset": true, 00:15:34.158 "nvme_admin": false, 00:15:34.158 "nvme_io": false, 00:15:34.158 "nvme_io_md": false, 00:15:34.158 "write_zeroes": true, 00:15:34.158 "zcopy": true, 00:15:34.158 "get_zone_info": false, 00:15:34.158 "zone_management": false, 00:15:34.158 "zone_append": false, 00:15:34.158 "compare": false, 00:15:34.158 "compare_and_write": false, 00:15:34.158 "abort": true, 00:15:34.158 "seek_hole": false, 00:15:34.158 "seek_data": false, 00:15:34.158 "copy": true, 00:15:34.158 "nvme_iov_md": false 00:15:34.158 }, 00:15:34.158 "memory_domains": [ 00:15:34.158 { 00:15:34.158 "dma_device_id": "system", 00:15:34.158 "dma_device_type": 1 00:15:34.158 }, 00:15:34.158 { 00:15:34.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.158 "dma_device_type": 2 00:15:34.158 } 00:15:34.158 ], 00:15:34.158 "driver_specific": { 00:15:34.158 "passthru": { 00:15:34.158 "name": "pt2", 00:15:34.158 "base_bdev_name": "malloc2" 00:15:34.158 } 00:15:34.158 } 00:15:34.158 }' 00:15:34.158 18:19:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.158 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.158 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:34.158 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.158 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.416 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:34.416 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.416 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.416 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:34.416 18:19:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.416 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.416 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:34.416 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:34.416 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:34.416 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:34.675 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:34.675 "name": "pt3", 00:15:34.675 "aliases": [ 00:15:34.675 "00000000-0000-0000-0000-000000000003" 00:15:34.675 ], 00:15:34.675 "product_name": "passthru", 00:15:34.675 "block_size": 512, 00:15:34.675 "num_blocks": 65536, 00:15:34.675 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:34.675 "assigned_rate_limits": { 
00:15:34.675 "rw_ios_per_sec": 0, 00:15:34.675 "rw_mbytes_per_sec": 0, 00:15:34.675 "r_mbytes_per_sec": 0, 00:15:34.675 "w_mbytes_per_sec": 0 00:15:34.675 }, 00:15:34.675 "claimed": true, 00:15:34.675 "claim_type": "exclusive_write", 00:15:34.675 "zoned": false, 00:15:34.675 "supported_io_types": { 00:15:34.675 "read": true, 00:15:34.675 "write": true, 00:15:34.675 "unmap": true, 00:15:34.675 "flush": true, 00:15:34.675 "reset": true, 00:15:34.675 "nvme_admin": false, 00:15:34.675 "nvme_io": false, 00:15:34.675 "nvme_io_md": false, 00:15:34.675 "write_zeroes": true, 00:15:34.675 "zcopy": true, 00:15:34.675 "get_zone_info": false, 00:15:34.675 "zone_management": false, 00:15:34.675 "zone_append": false, 00:15:34.675 "compare": false, 00:15:34.675 "compare_and_write": false, 00:15:34.675 "abort": true, 00:15:34.675 "seek_hole": false, 00:15:34.675 "seek_data": false, 00:15:34.675 "copy": true, 00:15:34.675 "nvme_iov_md": false 00:15:34.675 }, 00:15:34.675 "memory_domains": [ 00:15:34.675 { 00:15:34.675 "dma_device_id": "system", 00:15:34.675 "dma_device_type": 1 00:15:34.675 }, 00:15:34.675 { 00:15:34.675 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.675 "dma_device_type": 2 00:15:34.675 } 00:15:34.675 ], 00:15:34.675 "driver_specific": { 00:15:34.675 "passthru": { 00:15:34.675 "name": "pt3", 00:15:34.675 "base_bdev_name": "malloc3" 00:15:34.675 } 00:15:34.675 } 00:15:34.675 }' 00:15:34.675 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.675 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.675 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:34.933 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:35.192 [2024-07-12 18:19:18.876424] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:35.192 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=4d8afa8a-7615-457e-a6b5-4acfbd17e608 00:15:35.192 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 4d8afa8a-7615-457e-a6b5-4acfbd17e608 ']' 00:15:35.192 18:19:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:35.450 [2024-07-12 18:19:19.120799] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:35.450 [2024-07-12 18:19:19.120820] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:35.450 [2024-07-12 18:19:19.120865] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:35.450 [2024-07-12 18:19:19.120917] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:15:35.450 [2024-07-12 18:19:19.120933] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x187dea0 name raid_bdev1, state offline 00:15:35.450 18:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.450 18:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:35.708 18:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:35.708 18:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:35.708 18:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:35.708 18:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:36.275 18:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:36.275 18:19:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:36.533 18:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:36.533 18:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:36.792 18:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:36.792 18:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:37.050 18:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:15:37.050 18:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:37.050 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:37.051 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:37.309 [2024-07-12 18:19:20.869350] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:37.309 [2024-07-12 18:19:20.870686] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:37.309 [2024-07-12 18:19:20.870729] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:37.309 [2024-07-12 18:19:20.870772] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:37.309 [2024-07-12 18:19:20.870809] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:37.309 [2024-07-12 18:19:20.870832] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:37.309 [2024-07-12 18:19:20.870850] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:37.309 [2024-07-12 18:19:20.870859] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a28ff0 name raid_bdev1, state configuring 00:15:37.309 request: 00:15:37.309 { 00:15:37.309 "name": "raid_bdev1", 00:15:37.309 "raid_level": "concat", 00:15:37.309 "base_bdevs": [ 00:15:37.309 "malloc1", 00:15:37.309 "malloc2", 00:15:37.309 "malloc3" 00:15:37.309 ], 00:15:37.309 "strip_size_kb": 64, 00:15:37.309 "superblock": false, 00:15:37.309 "method": "bdev_raid_create", 00:15:37.309 "req_id": 1 00:15:37.309 } 00:15:37.309 Got JSON-RPC error response 00:15:37.309 response: 00:15:37.309 { 00:15:37.309 "code": -17, 00:15:37.309 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:37.309 } 00:15:37.309 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:37.309 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:15:37.309 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:37.309 18:19:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:37.309 18:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.309 18:19:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:37.567 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:37.567 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:37.567 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:37.826 [2024-07-12 18:19:21.298415] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:37.826 [2024-07-12 18:19:21.298460] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:37.826 [2024-07-12 18:19:21.298481] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18857a0 00:15:37.826 [2024-07-12 18:19:21.298494] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:37.826 [2024-07-12 18:19:21.300129] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:37.826 [2024-07-12 18:19:21.300157] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:37.826 [2024-07-12 18:19:21.300224] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:37.826 [2024-07-12 18:19:21.300249] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:37.826 pt1 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.826 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:38.095 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.095 "name": "raid_bdev1", 00:15:38.095 "uuid": "4d8afa8a-7615-457e-a6b5-4acfbd17e608", 00:15:38.095 "strip_size_kb": 64, 00:15:38.095 "state": "configuring", 00:15:38.095 "raid_level": "concat", 00:15:38.095 "superblock": true, 00:15:38.095 "num_base_bdevs": 3, 00:15:38.095 "num_base_bdevs_discovered": 1, 00:15:38.095 "num_base_bdevs_operational": 3, 00:15:38.095 "base_bdevs_list": [ 00:15:38.095 { 00:15:38.095 "name": "pt1", 00:15:38.095 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:38.095 
"is_configured": true, 00:15:38.095 "data_offset": 2048, 00:15:38.095 "data_size": 63488 00:15:38.095 }, 00:15:38.095 { 00:15:38.095 "name": null, 00:15:38.095 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:38.095 "is_configured": false, 00:15:38.095 "data_offset": 2048, 00:15:38.095 "data_size": 63488 00:15:38.095 }, 00:15:38.095 { 00:15:38.095 "name": null, 00:15:38.095 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:38.095 "is_configured": false, 00:15:38.095 "data_offset": 2048, 00:15:38.095 "data_size": 63488 00:15:38.095 } 00:15:38.095 ] 00:15:38.095 }' 00:15:38.095 18:19:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.095 18:19:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.733 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:38.734 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:38.734 [2024-07-12 18:19:22.389314] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:38.734 [2024-07-12 18:19:22.389362] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:38.734 [2024-07-12 18:19:22.389381] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x187cc70 00:15:38.734 [2024-07-12 18:19:22.389393] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:38.734 [2024-07-12 18:19:22.389731] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:38.734 [2024-07-12 18:19:22.389748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:38.734 [2024-07-12 18:19:22.389810] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:38.734 [2024-07-12 
18:19:22.389828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:38.734 pt2 00:15:38.734 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:38.992 [2024-07-12 18:19:22.637980] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:38.992 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.993 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:39.273 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:39.273 "name": "raid_bdev1", 00:15:39.273 
"uuid": "4d8afa8a-7615-457e-a6b5-4acfbd17e608", 00:15:39.273 "strip_size_kb": 64, 00:15:39.273 "state": "configuring", 00:15:39.273 "raid_level": "concat", 00:15:39.273 "superblock": true, 00:15:39.273 "num_base_bdevs": 3, 00:15:39.273 "num_base_bdevs_discovered": 1, 00:15:39.273 "num_base_bdevs_operational": 3, 00:15:39.273 "base_bdevs_list": [ 00:15:39.273 { 00:15:39.273 "name": "pt1", 00:15:39.273 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:39.273 "is_configured": true, 00:15:39.273 "data_offset": 2048, 00:15:39.273 "data_size": 63488 00:15:39.273 }, 00:15:39.273 { 00:15:39.273 "name": null, 00:15:39.273 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:39.273 "is_configured": false, 00:15:39.273 "data_offset": 2048, 00:15:39.273 "data_size": 63488 00:15:39.273 }, 00:15:39.273 { 00:15:39.273 "name": null, 00:15:39.273 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:39.273 "is_configured": false, 00:15:39.274 "data_offset": 2048, 00:15:39.274 "data_size": 63488 00:15:39.274 } 00:15:39.274 ] 00:15:39.274 }' 00:15:39.274 18:19:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:39.274 18:19:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.846 18:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:39.846 18:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:39.846 18:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:40.105 [2024-07-12 18:19:23.744918] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:40.105 [2024-07-12 18:19:23.744972] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.105 [2024-07-12 18:19:23.744992] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1885a10 00:15:40.105 [2024-07-12 18:19:23.745004] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.105 [2024-07-12 18:19:23.745328] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.105 [2024-07-12 18:19:23.745345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:40.105 [2024-07-12 18:19:23.745405] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:40.105 [2024-07-12 18:19:23.745424] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:40.105 pt2 00:15:40.105 18:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:40.105 18:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:40.105 18:19:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:40.363 [2024-07-12 18:19:23.985569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:40.363 [2024-07-12 18:19:23.985603] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.363 [2024-07-12 18:19:23.985619] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a1f740 00:15:40.363 [2024-07-12 18:19:23.985631] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.363 [2024-07-12 18:19:23.985913] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.363 [2024-07-12 18:19:23.985938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:40.363 [2024-07-12 18:19:23.985992] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:40.363 
[2024-07-12 18:19:23.986009] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:40.363 [2024-07-12 18:19:23.986111] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a1fc00 00:15:40.363 [2024-07-12 18:19:23.986121] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:40.363 [2024-07-12 18:19:23.986286] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1884a40 00:15:40.363 [2024-07-12 18:19:23.986406] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a1fc00 00:15:40.363 [2024-07-12 18:19:23.986416] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a1fc00 00:15:40.363 [2024-07-12 18:19:23.986507] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:40.363 pt3 00:15:40.363 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:40.363 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:40.363 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:40.363 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:40.364 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:40.364 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:40.364 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:40.364 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:40.364 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:40.364 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:40.364 
18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:40.364 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:40.364 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.364 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:40.622 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:40.622 "name": "raid_bdev1", 00:15:40.622 "uuid": "4d8afa8a-7615-457e-a6b5-4acfbd17e608", 00:15:40.622 "strip_size_kb": 64, 00:15:40.622 "state": "online", 00:15:40.622 "raid_level": "concat", 00:15:40.622 "superblock": true, 00:15:40.622 "num_base_bdevs": 3, 00:15:40.622 "num_base_bdevs_discovered": 3, 00:15:40.622 "num_base_bdevs_operational": 3, 00:15:40.622 "base_bdevs_list": [ 00:15:40.622 { 00:15:40.622 "name": "pt1", 00:15:40.622 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:40.622 "is_configured": true, 00:15:40.622 "data_offset": 2048, 00:15:40.622 "data_size": 63488 00:15:40.622 }, 00:15:40.622 { 00:15:40.622 "name": "pt2", 00:15:40.622 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:40.622 "is_configured": true, 00:15:40.622 "data_offset": 2048, 00:15:40.622 "data_size": 63488 00:15:40.622 }, 00:15:40.622 { 00:15:40.622 "name": "pt3", 00:15:40.622 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:40.622 "is_configured": true, 00:15:40.622 "data_offset": 2048, 00:15:40.622 "data_size": 63488 00:15:40.622 } 00:15:40.622 ] 00:15:40.622 }' 00:15:40.622 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:40.622 18:19:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.189 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:15:41.189 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:41.189 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:41.189 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:41.189 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:41.189 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:41.189 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:41.189 18:19:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:41.448 [2024-07-12 18:19:25.092758] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:41.448 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:41.448 "name": "raid_bdev1", 00:15:41.448 "aliases": [ 00:15:41.448 "4d8afa8a-7615-457e-a6b5-4acfbd17e608" 00:15:41.448 ], 00:15:41.448 "product_name": "Raid Volume", 00:15:41.448 "block_size": 512, 00:15:41.449 "num_blocks": 190464, 00:15:41.449 "uuid": "4d8afa8a-7615-457e-a6b5-4acfbd17e608", 00:15:41.449 "assigned_rate_limits": { 00:15:41.449 "rw_ios_per_sec": 0, 00:15:41.449 "rw_mbytes_per_sec": 0, 00:15:41.449 "r_mbytes_per_sec": 0, 00:15:41.449 "w_mbytes_per_sec": 0 00:15:41.449 }, 00:15:41.449 "claimed": false, 00:15:41.449 "zoned": false, 00:15:41.449 "supported_io_types": { 00:15:41.449 "read": true, 00:15:41.449 "write": true, 00:15:41.449 "unmap": true, 00:15:41.449 "flush": true, 00:15:41.449 "reset": true, 00:15:41.449 "nvme_admin": false, 00:15:41.449 "nvme_io": false, 00:15:41.449 "nvme_io_md": false, 00:15:41.449 "write_zeroes": true, 00:15:41.449 "zcopy": false, 00:15:41.449 
"get_zone_info": false, 00:15:41.449 "zone_management": false, 00:15:41.449 "zone_append": false, 00:15:41.449 "compare": false, 00:15:41.449 "compare_and_write": false, 00:15:41.449 "abort": false, 00:15:41.449 "seek_hole": false, 00:15:41.449 "seek_data": false, 00:15:41.449 "copy": false, 00:15:41.449 "nvme_iov_md": false 00:15:41.449 }, 00:15:41.449 "memory_domains": [ 00:15:41.449 { 00:15:41.449 "dma_device_id": "system", 00:15:41.449 "dma_device_type": 1 00:15:41.449 }, 00:15:41.449 { 00:15:41.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.449 "dma_device_type": 2 00:15:41.449 }, 00:15:41.449 { 00:15:41.449 "dma_device_id": "system", 00:15:41.449 "dma_device_type": 1 00:15:41.449 }, 00:15:41.449 { 00:15:41.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.449 "dma_device_type": 2 00:15:41.449 }, 00:15:41.449 { 00:15:41.449 "dma_device_id": "system", 00:15:41.449 "dma_device_type": 1 00:15:41.449 }, 00:15:41.449 { 00:15:41.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.449 "dma_device_type": 2 00:15:41.449 } 00:15:41.449 ], 00:15:41.449 "driver_specific": { 00:15:41.449 "raid": { 00:15:41.449 "uuid": "4d8afa8a-7615-457e-a6b5-4acfbd17e608", 00:15:41.449 "strip_size_kb": 64, 00:15:41.449 "state": "online", 00:15:41.449 "raid_level": "concat", 00:15:41.449 "superblock": true, 00:15:41.449 "num_base_bdevs": 3, 00:15:41.449 "num_base_bdevs_discovered": 3, 00:15:41.449 "num_base_bdevs_operational": 3, 00:15:41.449 "base_bdevs_list": [ 00:15:41.449 { 00:15:41.449 "name": "pt1", 00:15:41.449 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:41.449 "is_configured": true, 00:15:41.449 "data_offset": 2048, 00:15:41.449 "data_size": 63488 00:15:41.449 }, 00:15:41.449 { 00:15:41.449 "name": "pt2", 00:15:41.449 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:41.449 "is_configured": true, 00:15:41.449 "data_offset": 2048, 00:15:41.449 "data_size": 63488 00:15:41.449 }, 00:15:41.449 { 00:15:41.449 "name": "pt3", 00:15:41.449 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:15:41.449 "is_configured": true, 00:15:41.449 "data_offset": 2048, 00:15:41.449 "data_size": 63488 00:15:41.449 } 00:15:41.449 ] 00:15:41.449 } 00:15:41.449 } 00:15:41.449 }' 00:15:41.449 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:41.449 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:41.449 pt2 00:15:41.449 pt3' 00:15:41.449 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:41.449 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:41.449 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:41.707 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:41.708 "name": "pt1", 00:15:41.708 "aliases": [ 00:15:41.708 "00000000-0000-0000-0000-000000000001" 00:15:41.708 ], 00:15:41.708 "product_name": "passthru", 00:15:41.708 "block_size": 512, 00:15:41.708 "num_blocks": 65536, 00:15:41.708 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:41.708 "assigned_rate_limits": { 00:15:41.708 "rw_ios_per_sec": 0, 00:15:41.708 "rw_mbytes_per_sec": 0, 00:15:41.708 "r_mbytes_per_sec": 0, 00:15:41.708 "w_mbytes_per_sec": 0 00:15:41.708 }, 00:15:41.708 "claimed": true, 00:15:41.708 "claim_type": "exclusive_write", 00:15:41.708 "zoned": false, 00:15:41.708 "supported_io_types": { 00:15:41.708 "read": true, 00:15:41.708 "write": true, 00:15:41.708 "unmap": true, 00:15:41.708 "flush": true, 00:15:41.708 "reset": true, 00:15:41.708 "nvme_admin": false, 00:15:41.708 "nvme_io": false, 00:15:41.708 "nvme_io_md": false, 00:15:41.708 "write_zeroes": true, 00:15:41.708 "zcopy": true, 00:15:41.708 "get_zone_info": false, 
00:15:41.708 "zone_management": false, 00:15:41.708 "zone_append": false, 00:15:41.708 "compare": false, 00:15:41.708 "compare_and_write": false, 00:15:41.708 "abort": true, 00:15:41.708 "seek_hole": false, 00:15:41.708 "seek_data": false, 00:15:41.708 "copy": true, 00:15:41.708 "nvme_iov_md": false 00:15:41.708 }, 00:15:41.708 "memory_domains": [ 00:15:41.708 { 00:15:41.708 "dma_device_id": "system", 00:15:41.708 "dma_device_type": 1 00:15:41.708 }, 00:15:41.708 { 00:15:41.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.708 "dma_device_type": 2 00:15:41.708 } 00:15:41.708 ], 00:15:41.708 "driver_specific": { 00:15:41.708 "passthru": { 00:15:41.708 "name": "pt1", 00:15:41.708 "base_bdev_name": "malloc1" 00:15:41.708 } 00:15:41.708 } 00:15:41.708 }' 00:15:41.708 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.966 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.966 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:41.966 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.966 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.966 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:41.966 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.966 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.966 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:41.966 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.225 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.225 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.225 18:19:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:42.225 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:42.225 18:19:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.483 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.483 "name": "pt2", 00:15:42.483 "aliases": [ 00:15:42.483 "00000000-0000-0000-0000-000000000002" 00:15:42.483 ], 00:15:42.483 "product_name": "passthru", 00:15:42.483 "block_size": 512, 00:15:42.483 "num_blocks": 65536, 00:15:42.483 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:42.483 "assigned_rate_limits": { 00:15:42.483 "rw_ios_per_sec": 0, 00:15:42.483 "rw_mbytes_per_sec": 0, 00:15:42.483 "r_mbytes_per_sec": 0, 00:15:42.483 "w_mbytes_per_sec": 0 00:15:42.483 }, 00:15:42.483 "claimed": true, 00:15:42.483 "claim_type": "exclusive_write", 00:15:42.483 "zoned": false, 00:15:42.483 "supported_io_types": { 00:15:42.483 "read": true, 00:15:42.483 "write": true, 00:15:42.483 "unmap": true, 00:15:42.483 "flush": true, 00:15:42.483 "reset": true, 00:15:42.483 "nvme_admin": false, 00:15:42.483 "nvme_io": false, 00:15:42.483 "nvme_io_md": false, 00:15:42.483 "write_zeroes": true, 00:15:42.483 "zcopy": true, 00:15:42.483 "get_zone_info": false, 00:15:42.483 "zone_management": false, 00:15:42.483 "zone_append": false, 00:15:42.483 "compare": false, 00:15:42.483 "compare_and_write": false, 00:15:42.483 "abort": true, 00:15:42.483 "seek_hole": false, 00:15:42.483 "seek_data": false, 00:15:42.483 "copy": true, 00:15:42.483 "nvme_iov_md": false 00:15:42.483 }, 00:15:42.483 "memory_domains": [ 00:15:42.483 { 00:15:42.483 "dma_device_id": "system", 00:15:42.483 "dma_device_type": 1 00:15:42.483 }, 00:15:42.483 { 00:15:42.483 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.483 
"dma_device_type": 2 00:15:42.483 } 00:15:42.483 ], 00:15:42.483 "driver_specific": { 00:15:42.483 "passthru": { 00:15:42.483 "name": "pt2", 00:15:42.483 "base_bdev_name": "malloc2" 00:15:42.483 } 00:15:42.483 } 00:15:42.483 }' 00:15:42.483 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.483 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.483 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:42.483 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.483 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.483 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:42.483 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.742 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.742 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:42.742 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.742 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.742 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.742 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:42.742 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:42.742 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:43.001 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:43.001 "name": "pt3", 00:15:43.001 "aliases": [ 00:15:43.001 
"00000000-0000-0000-0000-000000000003" 00:15:43.001 ], 00:15:43.001 "product_name": "passthru", 00:15:43.001 "block_size": 512, 00:15:43.001 "num_blocks": 65536, 00:15:43.001 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:43.001 "assigned_rate_limits": { 00:15:43.001 "rw_ios_per_sec": 0, 00:15:43.001 "rw_mbytes_per_sec": 0, 00:15:43.001 "r_mbytes_per_sec": 0, 00:15:43.001 "w_mbytes_per_sec": 0 00:15:43.001 }, 00:15:43.001 "claimed": true, 00:15:43.001 "claim_type": "exclusive_write", 00:15:43.001 "zoned": false, 00:15:43.001 "supported_io_types": { 00:15:43.001 "read": true, 00:15:43.001 "write": true, 00:15:43.001 "unmap": true, 00:15:43.001 "flush": true, 00:15:43.001 "reset": true, 00:15:43.001 "nvme_admin": false, 00:15:43.001 "nvme_io": false, 00:15:43.001 "nvme_io_md": false, 00:15:43.001 "write_zeroes": true, 00:15:43.001 "zcopy": true, 00:15:43.001 "get_zone_info": false, 00:15:43.001 "zone_management": false, 00:15:43.001 "zone_append": false, 00:15:43.001 "compare": false, 00:15:43.001 "compare_and_write": false, 00:15:43.001 "abort": true, 00:15:43.001 "seek_hole": false, 00:15:43.001 "seek_data": false, 00:15:43.001 "copy": true, 00:15:43.001 "nvme_iov_md": false 00:15:43.001 }, 00:15:43.001 "memory_domains": [ 00:15:43.001 { 00:15:43.001 "dma_device_id": "system", 00:15:43.001 "dma_device_type": 1 00:15:43.001 }, 00:15:43.001 { 00:15:43.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.001 "dma_device_type": 2 00:15:43.001 } 00:15:43.001 ], 00:15:43.001 "driver_specific": { 00:15:43.001 "passthru": { 00:15:43.001 "name": "pt3", 00:15:43.001 "base_bdev_name": "malloc3" 00:15:43.001 } 00:15:43.001 } 00:15:43.001 }' 00:15:43.001 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.001 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.001 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.001 18:19:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:43.259 18:19:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:43.518 [2024-07-12 18:19:27.170299] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 4d8afa8a-7615-457e-a6b5-4acfbd17e608 '!=' 4d8afa8a-7615-457e-a6b5-4acfbd17e608 ']' 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2499945 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2499945 ']' 00:15:43.518 18:19:27 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2499945 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2499945 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2499945' 00:15:43.518 killing process with pid 2499945 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2499945 00:15:43.518 [2024-07-12 18:19:27.239123] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:43.518 [2024-07-12 18:19:27.239173] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:43.518 [2024-07-12 18:19:27.239229] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:43.518 [2024-07-12 18:19:27.239241] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a1fc00 name raid_bdev1, state offline 00:15:43.518 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2499945 00:15:43.778 [2024-07-12 18:19:27.266841] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:43.778 18:19:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:43.778 00:15:43.778 real 0m15.501s 00:15:43.778 user 0m27.988s 00:15:43.778 sys 0m2.746s 00:15:43.778 18:19:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:43.778 18:19:27 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.778 ************************************ 00:15:43.778 END TEST raid_superblock_test 00:15:43.778 ************************************ 00:15:44.038 18:19:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:44.038 18:19:27 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:15:44.038 18:19:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:44.038 18:19:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:44.038 18:19:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:44.038 ************************************ 00:15:44.038 START TEST raid_read_error_test 00:15:44.038 ************************************ 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:44.038 18:19:27 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.CQhjyZfiuP 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2502183 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2502183 /var/tmp/spdk-raid.sock 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2502183 ']' 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:44.038 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:44.038 18:19:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.038 [2024-07-12 18:19:27.648302] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:15:44.038 [2024-07-12 18:19:27.648364] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2502183 ] 00:15:44.038 [2024-07-12 18:19:27.765244] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:44.298 [2024-07-12 18:19:27.867165] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:44.298 [2024-07-12 18:19:27.940339] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:44.298 [2024-07-12 18:19:27.940380] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:44.867 18:19:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:44.867 18:19:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:44.867 18:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:44.867 18:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:45.125 BaseBdev1_malloc 00:15:45.125 18:19:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:45.384 true 00:15:45.384 18:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:45.642 [2024-07-12 18:19:29.235957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:45.642 [2024-07-12 18:19:29.236005] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:45.642 [2024-07-12 18:19:29.236026] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x175b0d0 00:15:45.642 [2024-07-12 18:19:29.236039] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:45.642 [2024-07-12 18:19:29.237820] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:45.642 [2024-07-12 18:19:29.237851] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:45.642 BaseBdev1 00:15:45.642 18:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:45.642 18:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:45.900 BaseBdev2_malloc 00:15:45.900 18:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:46.157 true 00:15:46.157 18:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:46.414 [2024-07-12 18:19:29.970527] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:46.414 [2024-07-12 18:19:29.970571] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:46.414 [2024-07-12 18:19:29.970591] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x175f910 00:15:46.414 [2024-07-12 18:19:29.970604] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:46.414 [2024-07-12 18:19:29.972067] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:46.414 [2024-07-12 18:19:29.972096] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:46.414 BaseBdev2 00:15:46.414 18:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:46.414 18:19:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:46.671 BaseBdev3_malloc 00:15:46.671 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:46.928 true 00:15:46.928 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:46.928 [2024-07-12 18:19:30.633007] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:46.928 [2024-07-12 18:19:30.633055] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:46.928 [2024-07-12 18:19:30.633077] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1761bd0 00:15:46.928 [2024-07-12 18:19:30.633090] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:46.928 [2024-07-12 18:19:30.634614] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:46.928 [2024-07-12 18:19:30.634641] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:46.928 BaseBdev3 00:15:46.928 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:47.186 [2024-07-12 18:19:30.873668] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:47.186 [2024-07-12 18:19:30.874982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:47.186 [2024-07-12 18:19:30.875052] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:47.186 [2024-07-12 18:19:30.875257] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1763280 00:15:47.186 [2024-07-12 18:19:30.875269] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:47.186 [2024-07-12 18:19:30.875470] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1762e20 00:15:47.186 [2024-07-12 18:19:30.875618] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1763280 00:15:47.186 [2024-07-12 18:19:30.875628] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1763280 00:15:47.186 [2024-07-12 18:19:30.875729] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.186 18:19:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:47.444 18:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.444 "name": "raid_bdev1", 00:15:47.444 "uuid": "185f72d1-aa6e-495d-8a78-a61ec4669ffa", 00:15:47.444 "strip_size_kb": 64, 00:15:47.444 "state": "online", 00:15:47.444 "raid_level": "concat", 00:15:47.444 "superblock": true, 00:15:47.444 "num_base_bdevs": 3, 00:15:47.444 "num_base_bdevs_discovered": 3, 00:15:47.444 "num_base_bdevs_operational": 3, 00:15:47.444 "base_bdevs_list": [ 00:15:47.444 { 00:15:47.444 "name": "BaseBdev1", 00:15:47.444 "uuid": "260c8584-f341-53c0-87ff-b5d2f9336d5d", 00:15:47.444 "is_configured": true, 00:15:47.444 "data_offset": 2048, 00:15:47.444 "data_size": 63488 00:15:47.444 }, 00:15:47.444 { 00:15:47.444 "name": "BaseBdev2", 00:15:47.444 "uuid": "221dfc6d-5eca-54d4-9165-e4a95174fd2c", 00:15:47.444 "is_configured": true, 00:15:47.444 "data_offset": 2048, 00:15:47.444 "data_size": 63488 00:15:47.444 }, 00:15:47.444 { 00:15:47.444 "name": "BaseBdev3", 00:15:47.444 "uuid": "ae5cb7f5-4d47-5437-a677-8d676432fc3e", 00:15:47.444 "is_configured": true, 00:15:47.444 "data_offset": 2048, 00:15:47.444 "data_size": 63488 00:15:47.444 } 00:15:47.444 ] 00:15:47.444 }' 00:15:47.444 18:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.444 18:19:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:48.009 18:19:31 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:48.009 18:19:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:48.266 [2024-07-12 18:19:31.796381] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15b14d0 00:15:49.205 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.465 18:19:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:49.723 18:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.723 "name": "raid_bdev1", 00:15:49.723 "uuid": "185f72d1-aa6e-495d-8a78-a61ec4669ffa", 00:15:49.723 "strip_size_kb": 64, 00:15:49.723 "state": "online", 00:15:49.723 "raid_level": "concat", 00:15:49.723 "superblock": true, 00:15:49.723 "num_base_bdevs": 3, 00:15:49.723 "num_base_bdevs_discovered": 3, 00:15:49.723 "num_base_bdevs_operational": 3, 00:15:49.723 "base_bdevs_list": [ 00:15:49.723 { 00:15:49.723 "name": "BaseBdev1", 00:15:49.723 "uuid": "260c8584-f341-53c0-87ff-b5d2f9336d5d", 00:15:49.723 "is_configured": true, 00:15:49.723 "data_offset": 2048, 00:15:49.723 "data_size": 63488 00:15:49.723 }, 00:15:49.723 { 00:15:49.723 "name": "BaseBdev2", 00:15:49.723 "uuid": "221dfc6d-5eca-54d4-9165-e4a95174fd2c", 00:15:49.723 "is_configured": true, 00:15:49.723 "data_offset": 2048, 00:15:49.723 "data_size": 63488 00:15:49.723 }, 00:15:49.723 { 00:15:49.723 "name": "BaseBdev3", 00:15:49.723 "uuid": "ae5cb7f5-4d47-5437-a677-8d676432fc3e", 00:15:49.723 "is_configured": true, 00:15:49.723 "data_offset": 2048, 00:15:49.723 "data_size": 63488 00:15:49.723 } 00:15:49.723 ] 00:15:49.723 }' 00:15:49.723 18:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.723 18:19:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.291 18:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:50.291 [2024-07-12 
18:19:33.981746] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:50.291 [2024-07-12 18:19:33.981795] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:50.291 [2024-07-12 18:19:33.984970] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:50.291 [2024-07-12 18:19:33.985008] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:50.291 [2024-07-12 18:19:33.985042] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:50.291 [2024-07-12 18:19:33.985053] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1763280 name raid_bdev1, state offline 00:15:50.291 0 00:15:50.291 18:19:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2502183 00:15:50.291 18:19:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2502183 ']' 00:15:50.291 18:19:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2502183 00:15:50.291 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:50.291 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:50.291 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2502183 00:15:50.550 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:50.550 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:50.550 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2502183' 00:15:50.550 killing process with pid 2502183 00:15:50.550 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2502183 00:15:50.550 [2024-07-12 18:19:34.046792] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:50.550 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2502183 00:15:50.550 [2024-07-12 18:19:34.066790] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.CQhjyZfiuP 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:15:50.808 00:15:50.808 real 0m6.710s 00:15:50.808 user 0m10.538s 00:15:50.808 sys 0m1.188s 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:50.808 18:19:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.808 ************************************ 00:15:50.808 END TEST raid_read_error_test 00:15:50.808 ************************************ 00:15:50.808 18:19:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:50.808 18:19:34 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:15:50.808 18:19:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:50.808 18:19:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:50.808 18:19:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:50.808 
************************************ 00:15:50.808 START TEST raid_write_error_test 00:15:50.808 ************************************ 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ESTei4TiQN 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2503158 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2503158 /var/tmp/spdk-raid.sock 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2503158 ']' 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:15:50.808 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:50.808 18:19:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.808 [2024-07-12 18:19:34.455029] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:15:50.808 [2024-07-12 18:19:34.455100] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2503158 ] 00:15:51.067 [2024-07-12 18:19:34.585991] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:51.067 [2024-07-12 18:19:34.685153] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:51.067 [2024-07-12 18:19:34.746414] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:51.067 [2024-07-12 18:19:34.746461] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:52.002 18:19:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:52.002 18:19:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:52.002 18:19:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:52.002 18:19:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:52.002 BaseBdev1_malloc 00:15:52.002 18:19:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:52.259 
true 00:15:52.259 18:19:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:52.517 [2024-07-12 18:19:36.097008] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:52.517 [2024-07-12 18:19:36.097055] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:52.517 [2024-07-12 18:19:36.097075] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde60d0 00:15:52.517 [2024-07-12 18:19:36.097088] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:52.517 [2024-07-12 18:19:36.098878] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:52.517 [2024-07-12 18:19:36.098907] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:52.517 BaseBdev1 00:15:52.517 18:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:52.517 18:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:52.776 BaseBdev2_malloc 00:15:52.776 18:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:53.034 true 00:15:53.034 18:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:53.292 [2024-07-12 18:19:36.835529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:53.292 [2024-07-12 18:19:36.835576] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:53.292 [2024-07-12 18:19:36.835595] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdea910 00:15:53.292 [2024-07-12 18:19:36.835607] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:53.292 [2024-07-12 18:19:36.836996] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:53.292 [2024-07-12 18:19:36.837023] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:53.292 BaseBdev2 00:15:53.292 18:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:53.292 18:19:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:53.551 BaseBdev3_malloc 00:15:53.551 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:53.811 true 00:15:53.811 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:53.811 [2024-07-12 18:19:37.505953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:53.811 [2024-07-12 18:19:37.506000] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:53.811 [2024-07-12 18:19:37.506020] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdecbd0 00:15:53.811 [2024-07-12 18:19:37.506033] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:53.811 [2024-07-12 18:19:37.507565] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:15:53.811 [2024-07-12 18:19:37.507599] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:53.811 BaseBdev3 00:15:53.811 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:54.118 [2024-07-12 18:19:37.754630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:54.118 [2024-07-12 18:19:37.755858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:54.118 [2024-07-12 18:19:37.755934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:54.118 [2024-07-12 18:19:37.756136] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdee280 00:15:54.118 [2024-07-12 18:19:37.756148] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:54.118 [2024-07-12 18:19:37.756338] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdede20 00:15:54.118 [2024-07-12 18:19:37.756481] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdee280 00:15:54.118 [2024-07-12 18:19:37.756491] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xdee280 00:15:54.118 [2024-07-12 18:19:37.756589] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.118 18:19:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:54.376 18:19:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.376 "name": "raid_bdev1", 00:15:54.376 "uuid": "1380ae2f-506a-463f-8e42-8fc4f9756058", 00:15:54.376 "strip_size_kb": 64, 00:15:54.376 "state": "online", 00:15:54.376 "raid_level": "concat", 00:15:54.376 "superblock": true, 00:15:54.376 "num_base_bdevs": 3, 00:15:54.376 "num_base_bdevs_discovered": 3, 00:15:54.376 "num_base_bdevs_operational": 3, 00:15:54.376 "base_bdevs_list": [ 00:15:54.376 { 00:15:54.376 "name": "BaseBdev1", 00:15:54.376 "uuid": "30c349fe-1c8b-54ad-9f95-1742b96772d1", 00:15:54.376 "is_configured": true, 00:15:54.376 "data_offset": 2048, 00:15:54.376 "data_size": 63488 00:15:54.376 }, 00:15:54.376 { 00:15:54.376 "name": "BaseBdev2", 00:15:54.376 "uuid": "0e00fab5-6e0d-5cbd-a8dd-e8bd7aa8fd49", 00:15:54.376 "is_configured": true, 00:15:54.376 "data_offset": 2048, 00:15:54.376 "data_size": 63488 00:15:54.376 }, 00:15:54.376 { 00:15:54.376 
"name": "BaseBdev3", 00:15:54.376 "uuid": "3200bece-cffa-58bd-b7eb-44f03744c314", 00:15:54.376 "is_configured": true, 00:15:54.376 "data_offset": 2048, 00:15:54.376 "data_size": 63488 00:15:54.376 } 00:15:54.376 ] 00:15:54.376 }' 00:15:54.376 18:19:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.376 18:19:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.944 18:19:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:54.944 18:19:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:55.203 [2024-07-12 18:19:38.729502] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc3c4d0 00:15:56.138 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.397 18:19:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:56.656 18:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:56.656 "name": "raid_bdev1", 00:15:56.656 "uuid": "1380ae2f-506a-463f-8e42-8fc4f9756058", 00:15:56.656 "strip_size_kb": 64, 00:15:56.656 "state": "online", 00:15:56.656 "raid_level": "concat", 00:15:56.656 "superblock": true, 00:15:56.656 "num_base_bdevs": 3, 00:15:56.656 "num_base_bdevs_discovered": 3, 00:15:56.656 "num_base_bdevs_operational": 3, 00:15:56.656 "base_bdevs_list": [ 00:15:56.656 { 00:15:56.656 "name": "BaseBdev1", 00:15:56.656 "uuid": "30c349fe-1c8b-54ad-9f95-1742b96772d1", 00:15:56.656 "is_configured": true, 00:15:56.656 "data_offset": 2048, 00:15:56.656 "data_size": 63488 00:15:56.656 }, 00:15:56.656 { 00:15:56.656 "name": "BaseBdev2", 00:15:56.656 "uuid": "0e00fab5-6e0d-5cbd-a8dd-e8bd7aa8fd49", 00:15:56.656 "is_configured": true, 00:15:56.656 "data_offset": 2048, 00:15:56.656 "data_size": 63488 00:15:56.656 }, 00:15:56.656 { 00:15:56.656 "name": "BaseBdev3", 00:15:56.656 "uuid": "3200bece-cffa-58bd-b7eb-44f03744c314", 00:15:56.656 "is_configured": true, 00:15:56.656 "data_offset": 2048, 
00:15:56.656 "data_size": 63488 00:15:56.656 } 00:15:56.656 ] 00:15:56.656 }' 00:15:56.656 18:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:56.656 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:57.225 [2024-07-12 18:19:40.881923] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:57.225 [2024-07-12 18:19:40.881981] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:57.225 [2024-07-12 18:19:40.885151] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:57.225 [2024-07-12 18:19:40.885190] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:57.225 [2024-07-12 18:19:40.885225] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:57.225 [2024-07-12 18:19:40.885237] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdee280 name raid_bdev1, state offline 00:15:57.225 0 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2503158 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2503158 ']' 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2503158 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2503158 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2503158' 00:15:57.225 killing process with pid 2503158 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2503158 00:15:57.225 [2024-07-12 18:19:40.949609] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:57.225 18:19:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2503158 00:15:57.483 [2024-07-12 18:19:40.970498] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:57.483 18:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ESTei4TiQN 00:15:57.483 18:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:57.483 18:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:57.483 18:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:15:57.742 18:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:15:57.742 18:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:57.742 18:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:57.742 18:19:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:15:57.742 00:15:57.742 real 0m6.836s 00:15:57.742 user 0m10.811s 00:15:57.742 sys 0m1.204s 00:15:57.742 18:19:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:57.742 18:19:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.742 ************************************ 00:15:57.742 END TEST raid_write_error_test 
00:15:57.742 ************************************ 00:15:57.742 18:19:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:57.742 18:19:41 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:57.742 18:19:41 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:15:57.742 18:19:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:57.742 18:19:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:57.742 18:19:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:57.742 ************************************ 00:15:57.742 START TEST raid_state_function_test 00:15:57.742 ************************************ 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:57.742 18:19:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2504194 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2504194' 00:15:57.742 Process raid pid: 2504194 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2504194 /var/tmp/spdk-raid.sock 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2504194 ']' 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:57.742 18:19:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:57.742 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:57.743 18:19:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:57.743 18:19:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.743 [2024-07-12 18:19:41.360721] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:15:57.743 [2024-07-12 18:19:41.360788] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:58.001 [2024-07-12 18:19:41.492474] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:58.001 [2024-07-12 18:19:41.602865] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:58.001 [2024-07-12 18:19:41.669524] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:58.001 [2024-07-12 18:19:41.669556] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:58.567 18:19:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:58.567 18:19:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:58.567 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:58.826 [2024-07-12 18:19:42.515415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:58.826 [2024-07-12 18:19:42.515458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:58.826 [2024-07-12 18:19:42.515469] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:58.826 [2024-07-12 18:19:42.515485] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:58.826 [2024-07-12 18:19:42.515494] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:58.826 [2024-07-12 18:19:42.515506] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:58.826 18:19:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.826 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.085 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.085 "name": "Existed_Raid", 00:15:59.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.085 "strip_size_kb": 0, 00:15:59.085 "state": "configuring", 00:15:59.085 "raid_level": "raid1", 00:15:59.085 "superblock": false, 00:15:59.085 "num_base_bdevs": 3, 00:15:59.085 "num_base_bdevs_discovered": 0, 00:15:59.085 "num_base_bdevs_operational": 3, 00:15:59.085 "base_bdevs_list": [ 00:15:59.085 { 00:15:59.085 
"name": "BaseBdev1", 00:15:59.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.085 "is_configured": false, 00:15:59.085 "data_offset": 0, 00:15:59.085 "data_size": 0 00:15:59.085 }, 00:15:59.085 { 00:15:59.085 "name": "BaseBdev2", 00:15:59.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.085 "is_configured": false, 00:15:59.085 "data_offset": 0, 00:15:59.085 "data_size": 0 00:15:59.085 }, 00:15:59.085 { 00:15:59.085 "name": "BaseBdev3", 00:15:59.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.085 "is_configured": false, 00:15:59.085 "data_offset": 0, 00:15:59.085 "data_size": 0 00:15:59.085 } 00:15:59.085 ] 00:15:59.085 }' 00:15:59.085 18:19:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.085 18:19:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.018 18:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:00.018 [2024-07-12 18:19:43.642281] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:00.018 [2024-07-12 18:19:43.642312] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e0a80 name Existed_Raid, state configuring 00:16:00.018 18:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:00.276 [2024-07-12 18:19:43.882931] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:00.276 [2024-07-12 18:19:43.882962] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:00.276 [2024-07-12 18:19:43.882972] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:16:00.276 [2024-07-12 18:19:43.882984] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:00.276 [2024-07-12 18:19:43.882993] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:00.276 [2024-07-12 18:19:43.883004] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:00.276 18:19:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:00.535 [2024-07-12 18:19:44.137830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:00.535 BaseBdev1 00:16:00.535 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:00.535 18:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:00.535 18:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:00.535 18:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:00.535 18:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:00.535 18:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:00.535 18:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:00.793 18:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:01.051 [ 00:16:01.051 { 00:16:01.051 "name": "BaseBdev1", 00:16:01.051 "aliases": [ 00:16:01.051 "290ed002-d7bc-4382-97c5-9b3a4ecf1b12" 
00:16:01.051 ], 00:16:01.051 "product_name": "Malloc disk", 00:16:01.051 "block_size": 512, 00:16:01.051 "num_blocks": 65536, 00:16:01.051 "uuid": "290ed002-d7bc-4382-97c5-9b3a4ecf1b12", 00:16:01.051 "assigned_rate_limits": { 00:16:01.051 "rw_ios_per_sec": 0, 00:16:01.051 "rw_mbytes_per_sec": 0, 00:16:01.051 "r_mbytes_per_sec": 0, 00:16:01.051 "w_mbytes_per_sec": 0 00:16:01.051 }, 00:16:01.051 "claimed": true, 00:16:01.051 "claim_type": "exclusive_write", 00:16:01.051 "zoned": false, 00:16:01.051 "supported_io_types": { 00:16:01.051 "read": true, 00:16:01.051 "write": true, 00:16:01.051 "unmap": true, 00:16:01.051 "flush": true, 00:16:01.051 "reset": true, 00:16:01.051 "nvme_admin": false, 00:16:01.051 "nvme_io": false, 00:16:01.051 "nvme_io_md": false, 00:16:01.051 "write_zeroes": true, 00:16:01.051 "zcopy": true, 00:16:01.051 "get_zone_info": false, 00:16:01.051 "zone_management": false, 00:16:01.051 "zone_append": false, 00:16:01.051 "compare": false, 00:16:01.051 "compare_and_write": false, 00:16:01.051 "abort": true, 00:16:01.051 "seek_hole": false, 00:16:01.051 "seek_data": false, 00:16:01.051 "copy": true, 00:16:01.051 "nvme_iov_md": false 00:16:01.051 }, 00:16:01.051 "memory_domains": [ 00:16:01.051 { 00:16:01.051 "dma_device_id": "system", 00:16:01.051 "dma_device_type": 1 00:16:01.051 }, 00:16:01.051 { 00:16:01.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.051 "dma_device_type": 2 00:16:01.051 } 00:16:01.051 ], 00:16:01.051 "driver_specific": {} 00:16:01.051 } 00:16:01.051 ] 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.051 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.309 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.309 "name": "Existed_Raid", 00:16:01.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.309 "strip_size_kb": 0, 00:16:01.309 "state": "configuring", 00:16:01.309 "raid_level": "raid1", 00:16:01.309 "superblock": false, 00:16:01.309 "num_base_bdevs": 3, 00:16:01.309 "num_base_bdevs_discovered": 1, 00:16:01.309 "num_base_bdevs_operational": 3, 00:16:01.309 "base_bdevs_list": [ 00:16:01.309 { 00:16:01.309 "name": "BaseBdev1", 00:16:01.309 "uuid": "290ed002-d7bc-4382-97c5-9b3a4ecf1b12", 00:16:01.309 "is_configured": true, 00:16:01.309 "data_offset": 0, 00:16:01.309 "data_size": 65536 00:16:01.309 }, 00:16:01.309 { 00:16:01.309 "name": "BaseBdev2", 00:16:01.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.309 "is_configured": 
false, 00:16:01.309 "data_offset": 0, 00:16:01.309 "data_size": 0 00:16:01.309 }, 00:16:01.309 { 00:16:01.309 "name": "BaseBdev3", 00:16:01.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.309 "is_configured": false, 00:16:01.309 "data_offset": 0, 00:16:01.309 "data_size": 0 00:16:01.309 } 00:16:01.309 ] 00:16:01.309 }' 00:16:01.309 18:19:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.309 18:19:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.875 18:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:02.134 [2024-07-12 18:19:45.770145] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:02.134 [2024-07-12 18:19:45.770184] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e0310 name Existed_Raid, state configuring 00:16:02.134 18:19:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:02.393 [2024-07-12 18:19:46.014806] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:02.393 [2024-07-12 18:19:46.016301] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:02.393 [2024-07-12 18:19:46.016334] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:02.393 [2024-07-12 18:19:46.016344] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:02.393 [2024-07-12 18:19:46.016356] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.393 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.652 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.652 "name": "Existed_Raid", 00:16:02.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.652 "strip_size_kb": 0, 00:16:02.652 "state": "configuring", 00:16:02.652 "raid_level": "raid1", 00:16:02.652 "superblock": false, 00:16:02.652 "num_base_bdevs": 3, 
00:16:02.652 "num_base_bdevs_discovered": 1, 00:16:02.652 "num_base_bdevs_operational": 3, 00:16:02.652 "base_bdevs_list": [ 00:16:02.652 { 00:16:02.652 "name": "BaseBdev1", 00:16:02.652 "uuid": "290ed002-d7bc-4382-97c5-9b3a4ecf1b12", 00:16:02.652 "is_configured": true, 00:16:02.652 "data_offset": 0, 00:16:02.652 "data_size": 65536 00:16:02.652 }, 00:16:02.652 { 00:16:02.652 "name": "BaseBdev2", 00:16:02.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.652 "is_configured": false, 00:16:02.652 "data_offset": 0, 00:16:02.652 "data_size": 0 00:16:02.652 }, 00:16:02.652 { 00:16:02.652 "name": "BaseBdev3", 00:16:02.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.652 "is_configured": false, 00:16:02.652 "data_offset": 0, 00:16:02.652 "data_size": 0 00:16:02.652 } 00:16:02.652 ] 00:16:02.652 }' 00:16:02.652 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.652 18:19:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:03.219 18:19:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:03.478 [2024-07-12 18:19:47.081198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:03.478 BaseBdev2 00:16:03.478 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:03.478 18:19:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:03.478 18:19:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:03.478 18:19:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:03.478 18:19:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:03.478 18:19:47 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:03.478 18:19:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:03.736 18:19:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:03.995 [ 00:16:03.995 { 00:16:03.995 "name": "BaseBdev2", 00:16:03.995 "aliases": [ 00:16:03.995 "e502cfb1-892c-406e-ac07-4824e778dfe6" 00:16:03.995 ], 00:16:03.995 "product_name": "Malloc disk", 00:16:03.995 "block_size": 512, 00:16:03.995 "num_blocks": 65536, 00:16:03.995 "uuid": "e502cfb1-892c-406e-ac07-4824e778dfe6", 00:16:03.995 "assigned_rate_limits": { 00:16:03.995 "rw_ios_per_sec": 0, 00:16:03.995 "rw_mbytes_per_sec": 0, 00:16:03.995 "r_mbytes_per_sec": 0, 00:16:03.995 "w_mbytes_per_sec": 0 00:16:03.995 }, 00:16:03.995 "claimed": true, 00:16:03.995 "claim_type": "exclusive_write", 00:16:03.995 "zoned": false, 00:16:03.995 "supported_io_types": { 00:16:03.995 "read": true, 00:16:03.995 "write": true, 00:16:03.995 "unmap": true, 00:16:03.995 "flush": true, 00:16:03.995 "reset": true, 00:16:03.995 "nvme_admin": false, 00:16:03.995 "nvme_io": false, 00:16:03.995 "nvme_io_md": false, 00:16:03.995 "write_zeroes": true, 00:16:03.995 "zcopy": true, 00:16:03.995 "get_zone_info": false, 00:16:03.995 "zone_management": false, 00:16:03.995 "zone_append": false, 00:16:03.995 "compare": false, 00:16:03.995 "compare_and_write": false, 00:16:03.995 "abort": true, 00:16:03.995 "seek_hole": false, 00:16:03.995 "seek_data": false, 00:16:03.995 "copy": true, 00:16:03.995 "nvme_iov_md": false 00:16:03.995 }, 00:16:03.995 "memory_domains": [ 00:16:03.995 { 00:16:03.995 "dma_device_id": "system", 00:16:03.995 "dma_device_type": 1 00:16:03.995 }, 00:16:03.995 { 
00:16:03.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:03.995 "dma_device_type": 2 00:16:03.995 } 00:16:03.995 ], 00:16:03.995 "driver_specific": {} 00:16:03.995 } 00:16:03.995 ] 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.995 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:16:04.254 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.254 "name": "Existed_Raid", 00:16:04.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.254 "strip_size_kb": 0, 00:16:04.254 "state": "configuring", 00:16:04.254 "raid_level": "raid1", 00:16:04.254 "superblock": false, 00:16:04.254 "num_base_bdevs": 3, 00:16:04.254 "num_base_bdevs_discovered": 2, 00:16:04.254 "num_base_bdevs_operational": 3, 00:16:04.254 "base_bdevs_list": [ 00:16:04.254 { 00:16:04.254 "name": "BaseBdev1", 00:16:04.254 "uuid": "290ed002-d7bc-4382-97c5-9b3a4ecf1b12", 00:16:04.254 "is_configured": true, 00:16:04.254 "data_offset": 0, 00:16:04.254 "data_size": 65536 00:16:04.254 }, 00:16:04.254 { 00:16:04.254 "name": "BaseBdev2", 00:16:04.254 "uuid": "e502cfb1-892c-406e-ac07-4824e778dfe6", 00:16:04.254 "is_configured": true, 00:16:04.254 "data_offset": 0, 00:16:04.254 "data_size": 65536 00:16:04.254 }, 00:16:04.254 { 00:16:04.254 "name": "BaseBdev3", 00:16:04.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:04.254 "is_configured": false, 00:16:04.254 "data_offset": 0, 00:16:04.254 "data_size": 0 00:16:04.254 } 00:16:04.254 ] 00:16:04.254 }' 00:16:04.254 18:19:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.254 18:19:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.821 18:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:05.080 [2024-07-12 18:19:48.604578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:05.080 [2024-07-12 18:19:48.604611] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13e1400 00:16:05.080 [2024-07-12 18:19:48.604619] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:16:05.080 [2024-07-12 18:19:48.604864] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e0ef0 00:16:05.080 [2024-07-12 18:19:48.604992] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13e1400 00:16:05.080 [2024-07-12 18:19:48.605003] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13e1400 00:16:05.080 [2024-07-12 18:19:48.605156] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:05.080 BaseBdev3 00:16:05.080 18:19:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:05.080 18:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:05.080 18:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:05.080 18:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:05.080 18:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:05.080 18:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:05.080 18:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:05.339 18:19:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:05.598 [ 00:16:05.598 { 00:16:05.598 "name": "BaseBdev3", 00:16:05.598 "aliases": [ 00:16:05.598 "666010d1-6e8c-4e07-9877-ffcda1ce6b1f" 00:16:05.598 ], 00:16:05.598 "product_name": "Malloc disk", 00:16:05.598 "block_size": 512, 00:16:05.598 "num_blocks": 65536, 00:16:05.598 "uuid": "666010d1-6e8c-4e07-9877-ffcda1ce6b1f", 00:16:05.598 "assigned_rate_limits": { 
00:16:05.598 "rw_ios_per_sec": 0, 00:16:05.598 "rw_mbytes_per_sec": 0, 00:16:05.598 "r_mbytes_per_sec": 0, 00:16:05.598 "w_mbytes_per_sec": 0 00:16:05.598 }, 00:16:05.598 "claimed": true, 00:16:05.598 "claim_type": "exclusive_write", 00:16:05.598 "zoned": false, 00:16:05.598 "supported_io_types": { 00:16:05.598 "read": true, 00:16:05.598 "write": true, 00:16:05.598 "unmap": true, 00:16:05.598 "flush": true, 00:16:05.598 "reset": true, 00:16:05.598 "nvme_admin": false, 00:16:05.598 "nvme_io": false, 00:16:05.598 "nvme_io_md": false, 00:16:05.598 "write_zeroes": true, 00:16:05.598 "zcopy": true, 00:16:05.598 "get_zone_info": false, 00:16:05.598 "zone_management": false, 00:16:05.598 "zone_append": false, 00:16:05.598 "compare": false, 00:16:05.598 "compare_and_write": false, 00:16:05.598 "abort": true, 00:16:05.598 "seek_hole": false, 00:16:05.598 "seek_data": false, 00:16:05.598 "copy": true, 00:16:05.598 "nvme_iov_md": false 00:16:05.598 }, 00:16:05.598 "memory_domains": [ 00:16:05.598 { 00:16:05.598 "dma_device_id": "system", 00:16:05.598 "dma_device_type": 1 00:16:05.598 }, 00:16:05.598 { 00:16:05.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.598 "dma_device_type": 2 00:16:05.598 } 00:16:05.598 ], 00:16:05.598 "driver_specific": {} 00:16:05.598 } 00:16:05.598 ] 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:05.598 
18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.598 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.856 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.856 "name": "Existed_Raid", 00:16:05.856 "uuid": "78d3f059-260c-405c-974d-4cedf6fd7049", 00:16:05.856 "strip_size_kb": 0, 00:16:05.856 "state": "online", 00:16:05.856 "raid_level": "raid1", 00:16:05.856 "superblock": false, 00:16:05.856 "num_base_bdevs": 3, 00:16:05.856 "num_base_bdevs_discovered": 3, 00:16:05.856 "num_base_bdevs_operational": 3, 00:16:05.856 "base_bdevs_list": [ 00:16:05.856 { 00:16:05.856 "name": "BaseBdev1", 00:16:05.856 "uuid": "290ed002-d7bc-4382-97c5-9b3a4ecf1b12", 00:16:05.856 "is_configured": true, 00:16:05.856 "data_offset": 0, 00:16:05.856 "data_size": 65536 00:16:05.856 }, 00:16:05.856 { 00:16:05.856 "name": "BaseBdev2", 00:16:05.856 "uuid": "e502cfb1-892c-406e-ac07-4824e778dfe6", 00:16:05.856 "is_configured": true, 00:16:05.856 "data_offset": 0, 
00:16:05.856 "data_size": 65536 00:16:05.856 }, 00:16:05.856 { 00:16:05.856 "name": "BaseBdev3", 00:16:05.856 "uuid": "666010d1-6e8c-4e07-9877-ffcda1ce6b1f", 00:16:05.856 "is_configured": true, 00:16:05.856 "data_offset": 0, 00:16:05.856 "data_size": 65536 00:16:05.856 } 00:16:05.856 ] 00:16:05.856 }' 00:16:05.856 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.857 18:19:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.423 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:06.423 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:06.423 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:06.423 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:06.423 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:06.423 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:06.423 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:06.423 18:19:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:06.682 [2024-07-12 18:19:50.185089] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:06.682 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:06.682 "name": "Existed_Raid", 00:16:06.682 "aliases": [ 00:16:06.682 "78d3f059-260c-405c-974d-4cedf6fd7049" 00:16:06.682 ], 00:16:06.682 "product_name": "Raid Volume", 00:16:06.682 "block_size": 512, 00:16:06.682 "num_blocks": 65536, 00:16:06.682 "uuid": 
"78d3f059-260c-405c-974d-4cedf6fd7049", 00:16:06.682 "assigned_rate_limits": { 00:16:06.682 "rw_ios_per_sec": 0, 00:16:06.682 "rw_mbytes_per_sec": 0, 00:16:06.682 "r_mbytes_per_sec": 0, 00:16:06.682 "w_mbytes_per_sec": 0 00:16:06.682 }, 00:16:06.682 "claimed": false, 00:16:06.682 "zoned": false, 00:16:06.682 "supported_io_types": { 00:16:06.682 "read": true, 00:16:06.682 "write": true, 00:16:06.682 "unmap": false, 00:16:06.682 "flush": false, 00:16:06.682 "reset": true, 00:16:06.682 "nvme_admin": false, 00:16:06.682 "nvme_io": false, 00:16:06.682 "nvme_io_md": false, 00:16:06.682 "write_zeroes": true, 00:16:06.682 "zcopy": false, 00:16:06.682 "get_zone_info": false, 00:16:06.682 "zone_management": false, 00:16:06.682 "zone_append": false, 00:16:06.682 "compare": false, 00:16:06.682 "compare_and_write": false, 00:16:06.682 "abort": false, 00:16:06.682 "seek_hole": false, 00:16:06.682 "seek_data": false, 00:16:06.682 "copy": false, 00:16:06.682 "nvme_iov_md": false 00:16:06.682 }, 00:16:06.682 "memory_domains": [ 00:16:06.682 { 00:16:06.682 "dma_device_id": "system", 00:16:06.682 "dma_device_type": 1 00:16:06.682 }, 00:16:06.682 { 00:16:06.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.682 "dma_device_type": 2 00:16:06.682 }, 00:16:06.682 { 00:16:06.682 "dma_device_id": "system", 00:16:06.682 "dma_device_type": 1 00:16:06.682 }, 00:16:06.682 { 00:16:06.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.682 "dma_device_type": 2 00:16:06.682 }, 00:16:06.682 { 00:16:06.682 "dma_device_id": "system", 00:16:06.682 "dma_device_type": 1 00:16:06.682 }, 00:16:06.682 { 00:16:06.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.682 "dma_device_type": 2 00:16:06.682 } 00:16:06.682 ], 00:16:06.682 "driver_specific": { 00:16:06.682 "raid": { 00:16:06.682 "uuid": "78d3f059-260c-405c-974d-4cedf6fd7049", 00:16:06.682 "strip_size_kb": 0, 00:16:06.682 "state": "online", 00:16:06.682 "raid_level": "raid1", 00:16:06.682 "superblock": false, 00:16:06.682 
"num_base_bdevs": 3, 00:16:06.682 "num_base_bdevs_discovered": 3, 00:16:06.682 "num_base_bdevs_operational": 3, 00:16:06.682 "base_bdevs_list": [ 00:16:06.682 { 00:16:06.682 "name": "BaseBdev1", 00:16:06.682 "uuid": "290ed002-d7bc-4382-97c5-9b3a4ecf1b12", 00:16:06.682 "is_configured": true, 00:16:06.682 "data_offset": 0, 00:16:06.682 "data_size": 65536 00:16:06.682 }, 00:16:06.682 { 00:16:06.682 "name": "BaseBdev2", 00:16:06.682 "uuid": "e502cfb1-892c-406e-ac07-4824e778dfe6", 00:16:06.682 "is_configured": true, 00:16:06.682 "data_offset": 0, 00:16:06.682 "data_size": 65536 00:16:06.682 }, 00:16:06.682 { 00:16:06.682 "name": "BaseBdev3", 00:16:06.682 "uuid": "666010d1-6e8c-4e07-9877-ffcda1ce6b1f", 00:16:06.682 "is_configured": true, 00:16:06.682 "data_offset": 0, 00:16:06.682 "data_size": 65536 00:16:06.682 } 00:16:06.682 ] 00:16:06.682 } 00:16:06.682 } 00:16:06.682 }' 00:16:06.682 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:06.682 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:06.682 BaseBdev2 00:16:06.682 BaseBdev3' 00:16:06.682 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.682 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:06.682 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.941 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.941 "name": "BaseBdev1", 00:16:06.941 "aliases": [ 00:16:06.941 "290ed002-d7bc-4382-97c5-9b3a4ecf1b12" 00:16:06.941 ], 00:16:06.941 "product_name": "Malloc disk", 00:16:06.941 "block_size": 512, 00:16:06.941 "num_blocks": 65536, 00:16:06.941 "uuid": 
"290ed002-d7bc-4382-97c5-9b3a4ecf1b12", 00:16:06.941 "assigned_rate_limits": { 00:16:06.941 "rw_ios_per_sec": 0, 00:16:06.941 "rw_mbytes_per_sec": 0, 00:16:06.941 "r_mbytes_per_sec": 0, 00:16:06.941 "w_mbytes_per_sec": 0 00:16:06.941 }, 00:16:06.941 "claimed": true, 00:16:06.941 "claim_type": "exclusive_write", 00:16:06.941 "zoned": false, 00:16:06.941 "supported_io_types": { 00:16:06.941 "read": true, 00:16:06.941 "write": true, 00:16:06.941 "unmap": true, 00:16:06.941 "flush": true, 00:16:06.941 "reset": true, 00:16:06.941 "nvme_admin": false, 00:16:06.941 "nvme_io": false, 00:16:06.941 "nvme_io_md": false, 00:16:06.941 "write_zeroes": true, 00:16:06.941 "zcopy": true, 00:16:06.941 "get_zone_info": false, 00:16:06.941 "zone_management": false, 00:16:06.941 "zone_append": false, 00:16:06.941 "compare": false, 00:16:06.941 "compare_and_write": false, 00:16:06.941 "abort": true, 00:16:06.941 "seek_hole": false, 00:16:06.941 "seek_data": false, 00:16:06.941 "copy": true, 00:16:06.941 "nvme_iov_md": false 00:16:06.941 }, 00:16:06.941 "memory_domains": [ 00:16:06.941 { 00:16:06.941 "dma_device_id": "system", 00:16:06.941 "dma_device_type": 1 00:16:06.941 }, 00:16:06.941 { 00:16:06.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.941 "dma_device_type": 2 00:16:06.941 } 00:16:06.941 ], 00:16:06.941 "driver_specific": {} 00:16:06.941 }' 00:16:06.941 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.941 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.941 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.941 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.941 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.202 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.202 18:19:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.202 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.202 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.202 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.202 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.203 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.203 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.203 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:07.203 18:19:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.461 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.461 "name": "BaseBdev2", 00:16:07.461 "aliases": [ 00:16:07.461 "e502cfb1-892c-406e-ac07-4824e778dfe6" 00:16:07.461 ], 00:16:07.461 "product_name": "Malloc disk", 00:16:07.461 "block_size": 512, 00:16:07.461 "num_blocks": 65536, 00:16:07.461 "uuid": "e502cfb1-892c-406e-ac07-4824e778dfe6", 00:16:07.461 "assigned_rate_limits": { 00:16:07.461 "rw_ios_per_sec": 0, 00:16:07.461 "rw_mbytes_per_sec": 0, 00:16:07.461 "r_mbytes_per_sec": 0, 00:16:07.461 "w_mbytes_per_sec": 0 00:16:07.461 }, 00:16:07.461 "claimed": true, 00:16:07.461 "claim_type": "exclusive_write", 00:16:07.461 "zoned": false, 00:16:07.461 "supported_io_types": { 00:16:07.461 "read": true, 00:16:07.461 "write": true, 00:16:07.461 "unmap": true, 00:16:07.461 "flush": true, 00:16:07.461 "reset": true, 00:16:07.461 "nvme_admin": false, 00:16:07.461 "nvme_io": false, 00:16:07.461 "nvme_io_md": false, 
00:16:07.461 "write_zeroes": true, 00:16:07.461 "zcopy": true, 00:16:07.461 "get_zone_info": false, 00:16:07.461 "zone_management": false, 00:16:07.461 "zone_append": false, 00:16:07.461 "compare": false, 00:16:07.461 "compare_and_write": false, 00:16:07.461 "abort": true, 00:16:07.461 "seek_hole": false, 00:16:07.461 "seek_data": false, 00:16:07.461 "copy": true, 00:16:07.461 "nvme_iov_md": false 00:16:07.461 }, 00:16:07.461 "memory_domains": [ 00:16:07.461 { 00:16:07.461 "dma_device_id": "system", 00:16:07.461 "dma_device_type": 1 00:16:07.461 }, 00:16:07.461 { 00:16:07.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.461 "dma_device_type": 2 00:16:07.461 } 00:16:07.461 ], 00:16:07.461 "driver_specific": {} 00:16:07.461 }' 00:16:07.461 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.461 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.719 18:19:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:07.719 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.977 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.977 "name": "BaseBdev3", 00:16:07.977 "aliases": [ 00:16:07.977 "666010d1-6e8c-4e07-9877-ffcda1ce6b1f" 00:16:07.977 ], 00:16:07.977 "product_name": "Malloc disk", 00:16:07.977 "block_size": 512, 00:16:07.977 "num_blocks": 65536, 00:16:07.977 "uuid": "666010d1-6e8c-4e07-9877-ffcda1ce6b1f", 00:16:07.977 "assigned_rate_limits": { 00:16:07.977 "rw_ios_per_sec": 0, 00:16:07.977 "rw_mbytes_per_sec": 0, 00:16:07.977 "r_mbytes_per_sec": 0, 00:16:07.977 "w_mbytes_per_sec": 0 00:16:07.977 }, 00:16:07.977 "claimed": true, 00:16:07.977 "claim_type": "exclusive_write", 00:16:07.977 "zoned": false, 00:16:07.977 "supported_io_types": { 00:16:07.977 "read": true, 00:16:07.977 "write": true, 00:16:07.977 "unmap": true, 00:16:07.977 "flush": true, 00:16:07.977 "reset": true, 00:16:07.977 "nvme_admin": false, 00:16:07.977 "nvme_io": false, 00:16:07.977 "nvme_io_md": false, 00:16:07.977 "write_zeroes": true, 00:16:07.977 "zcopy": true, 00:16:07.977 "get_zone_info": false, 00:16:07.977 "zone_management": false, 00:16:07.977 "zone_append": false, 00:16:07.977 "compare": false, 00:16:07.977 "compare_and_write": false, 00:16:07.977 "abort": true, 00:16:07.977 "seek_hole": false, 00:16:07.977 "seek_data": false, 00:16:07.977 "copy": true, 00:16:07.977 "nvme_iov_md": false 00:16:07.977 }, 00:16:07.977 "memory_domains": [ 00:16:07.977 { 00:16:07.977 "dma_device_id": "system", 00:16:07.977 "dma_device_type": 1 00:16:07.977 }, 00:16:07.977 { 00:16:07.977 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:07.977 "dma_device_type": 2 00:16:07.977 } 00:16:07.977 ], 00:16:07.977 "driver_specific": {} 00:16:07.977 }' 00:16:07.977 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.235 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.235 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:08.236 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.236 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.236 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:08.236 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.236 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.236 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:08.236 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.494 18:19:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.494 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:08.494 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:08.753 [2024-07-12 18:19:52.258324] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.753 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:09.011 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:09.011 "name": "Existed_Raid", 00:16:09.011 "uuid": "78d3f059-260c-405c-974d-4cedf6fd7049", 00:16:09.011 "strip_size_kb": 0, 00:16:09.011 "state": "online", 00:16:09.011 "raid_level": "raid1", 
00:16:09.011 "superblock": false, 00:16:09.011 "num_base_bdevs": 3, 00:16:09.011 "num_base_bdevs_discovered": 2, 00:16:09.011 "num_base_bdevs_operational": 2, 00:16:09.011 "base_bdevs_list": [ 00:16:09.011 { 00:16:09.011 "name": null, 00:16:09.011 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:09.011 "is_configured": false, 00:16:09.011 "data_offset": 0, 00:16:09.011 "data_size": 65536 00:16:09.011 }, 00:16:09.011 { 00:16:09.011 "name": "BaseBdev2", 00:16:09.011 "uuid": "e502cfb1-892c-406e-ac07-4824e778dfe6", 00:16:09.011 "is_configured": true, 00:16:09.011 "data_offset": 0, 00:16:09.011 "data_size": 65536 00:16:09.011 }, 00:16:09.011 { 00:16:09.011 "name": "BaseBdev3", 00:16:09.011 "uuid": "666010d1-6e8c-4e07-9877-ffcda1ce6b1f", 00:16:09.011 "is_configured": true, 00:16:09.011 "data_offset": 0, 00:16:09.011 "data_size": 65536 00:16:09.011 } 00:16:09.011 ] 00:16:09.011 }' 00:16:09.011 18:19:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:09.011 18:19:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.595 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:09.595 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:09.595 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.595 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:09.854 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:09.854 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:09.854 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:10.113 [2024-07-12 18:19:53.619805] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:10.113 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:10.113 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:10.113 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.113 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:10.371 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:10.371 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:10.371 18:19:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:10.629 [2024-07-12 18:19:54.120541] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:10.629 [2024-07-12 18:19:54.120616] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:10.629 [2024-07-12 18:19:54.131301] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:10.629 [2024-07-12 18:19:54.131333] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:10.629 [2024-07-12 18:19:54.131344] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e1400 name Existed_Raid, state offline 00:16:10.629 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:10.629 18:19:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:10.629 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.629 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:10.888 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:10.888 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:10.888 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:10.888 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:10.888 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:10.888 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:11.214 BaseBdev2 00:16:11.214 18:19:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:11.214 18:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:11.214 18:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:11.214 18:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:11.214 18:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:11.214 18:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:11.214 18:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.214 18:19:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:11.472 [ 00:16:11.472 { 00:16:11.472 "name": "BaseBdev2", 00:16:11.472 "aliases": [ 00:16:11.472 "c119ee5f-d23e-4da6-a668-9dced66414dc" 00:16:11.472 ], 00:16:11.472 "product_name": "Malloc disk", 00:16:11.472 "block_size": 512, 00:16:11.472 "num_blocks": 65536, 00:16:11.472 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:11.472 "assigned_rate_limits": { 00:16:11.472 "rw_ios_per_sec": 0, 00:16:11.472 "rw_mbytes_per_sec": 0, 00:16:11.472 "r_mbytes_per_sec": 0, 00:16:11.472 "w_mbytes_per_sec": 0 00:16:11.472 }, 00:16:11.472 "claimed": false, 00:16:11.472 "zoned": false, 00:16:11.472 "supported_io_types": { 00:16:11.472 "read": true, 00:16:11.472 "write": true, 00:16:11.472 "unmap": true, 00:16:11.472 "flush": true, 00:16:11.472 "reset": true, 00:16:11.472 "nvme_admin": false, 00:16:11.472 "nvme_io": false, 00:16:11.472 "nvme_io_md": false, 00:16:11.472 "write_zeroes": true, 00:16:11.472 "zcopy": true, 00:16:11.472 "get_zone_info": false, 00:16:11.472 "zone_management": false, 00:16:11.472 "zone_append": false, 00:16:11.472 "compare": false, 00:16:11.472 "compare_and_write": false, 00:16:11.472 "abort": true, 00:16:11.472 "seek_hole": false, 00:16:11.472 "seek_data": false, 00:16:11.472 "copy": true, 00:16:11.472 "nvme_iov_md": false 00:16:11.472 }, 00:16:11.472 "memory_domains": [ 00:16:11.472 { 00:16:11.472 "dma_device_id": "system", 00:16:11.472 "dma_device_type": 1 00:16:11.472 }, 00:16:11.472 { 00:16:11.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.472 "dma_device_type": 2 00:16:11.472 } 00:16:11.472 ], 00:16:11.472 "driver_specific": {} 00:16:11.472 } 00:16:11.472 ] 00:16:11.472 18:19:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:11.472 
18:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:11.472 18:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:11.472 18:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:11.731 BaseBdev3 00:16:11.731 18:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:11.731 18:19:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:11.731 18:19:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:11.731 18:19:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:11.731 18:19:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:11.731 18:19:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:11.731 18:19:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.989 18:19:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:12.248 [ 00:16:12.248 { 00:16:12.248 "name": "BaseBdev3", 00:16:12.248 "aliases": [ 00:16:12.248 "fd970701-0b55-4621-bfbc-180a53749d58" 00:16:12.248 ], 00:16:12.248 "product_name": "Malloc disk", 00:16:12.248 "block_size": 512, 00:16:12.248 "num_blocks": 65536, 00:16:12.248 "uuid": "fd970701-0b55-4621-bfbc-180a53749d58", 00:16:12.248 "assigned_rate_limits": { 00:16:12.248 "rw_ios_per_sec": 0, 00:16:12.248 "rw_mbytes_per_sec": 0, 00:16:12.248 
"r_mbytes_per_sec": 0, 00:16:12.248 "w_mbytes_per_sec": 0 00:16:12.248 }, 00:16:12.248 "claimed": false, 00:16:12.248 "zoned": false, 00:16:12.248 "supported_io_types": { 00:16:12.248 "read": true, 00:16:12.249 "write": true, 00:16:12.249 "unmap": true, 00:16:12.249 "flush": true, 00:16:12.249 "reset": true, 00:16:12.249 "nvme_admin": false, 00:16:12.249 "nvme_io": false, 00:16:12.249 "nvme_io_md": false, 00:16:12.249 "write_zeroes": true, 00:16:12.249 "zcopy": true, 00:16:12.249 "get_zone_info": false, 00:16:12.249 "zone_management": false, 00:16:12.249 "zone_append": false, 00:16:12.249 "compare": false, 00:16:12.249 "compare_and_write": false, 00:16:12.249 "abort": true, 00:16:12.249 "seek_hole": false, 00:16:12.249 "seek_data": false, 00:16:12.249 "copy": true, 00:16:12.249 "nvme_iov_md": false 00:16:12.249 }, 00:16:12.249 "memory_domains": [ 00:16:12.249 { 00:16:12.249 "dma_device_id": "system", 00:16:12.249 "dma_device_type": 1 00:16:12.249 }, 00:16:12.249 { 00:16:12.249 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.249 "dma_device_type": 2 00:16:12.249 } 00:16:12.249 ], 00:16:12.249 "driver_specific": {} 00:16:12.249 } 00:16:12.249 ] 00:16:12.249 18:19:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:12.249 18:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:12.249 18:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:12.249 18:19:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:12.507 [2024-07-12 18:19:56.033986] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:12.507 [2024-07-12 18:19:56.034027] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:16:12.507 [2024-07-12 18:19:56.034045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:12.507 [2024-07-12 18:19:56.035360] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.507 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.766 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.766 "name": "Existed_Raid", 00:16:12.766 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.766 "strip_size_kb": 0, 00:16:12.766 "state": 
"configuring", 00:16:12.766 "raid_level": "raid1", 00:16:12.766 "superblock": false, 00:16:12.766 "num_base_bdevs": 3, 00:16:12.766 "num_base_bdevs_discovered": 2, 00:16:12.766 "num_base_bdevs_operational": 3, 00:16:12.766 "base_bdevs_list": [ 00:16:12.766 { 00:16:12.766 "name": "BaseBdev1", 00:16:12.766 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.766 "is_configured": false, 00:16:12.766 "data_offset": 0, 00:16:12.766 "data_size": 0 00:16:12.766 }, 00:16:12.766 { 00:16:12.766 "name": "BaseBdev2", 00:16:12.766 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:12.766 "is_configured": true, 00:16:12.766 "data_offset": 0, 00:16:12.766 "data_size": 65536 00:16:12.766 }, 00:16:12.766 { 00:16:12.766 "name": "BaseBdev3", 00:16:12.766 "uuid": "fd970701-0b55-4621-bfbc-180a53749d58", 00:16:12.766 "is_configured": true, 00:16:12.766 "data_offset": 0, 00:16:12.766 "data_size": 65536 00:16:12.766 } 00:16:12.766 ] 00:16:12.766 }' 00:16:12.766 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.766 18:19:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.333 18:19:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:13.592 [2024-07-12 18:19:57.104807] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:13.592 18:19:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.592 "name": "Existed_Raid", 00:16:13.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.592 "strip_size_kb": 0, 00:16:13.592 "state": "configuring", 00:16:13.592 "raid_level": "raid1", 00:16:13.592 "superblock": false, 00:16:13.592 "num_base_bdevs": 3, 00:16:13.592 "num_base_bdevs_discovered": 1, 00:16:13.592 "num_base_bdevs_operational": 3, 00:16:13.592 "base_bdevs_list": [ 00:16:13.592 { 00:16:13.592 "name": "BaseBdev1", 00:16:13.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:13.592 "is_configured": false, 00:16:13.592 "data_offset": 0, 00:16:13.592 "data_size": 0 00:16:13.592 }, 00:16:13.592 { 00:16:13.592 "name": null, 00:16:13.592 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:13.592 "is_configured": false, 00:16:13.592 "data_offset": 0, 00:16:13.592 "data_size": 65536 00:16:13.592 }, 00:16:13.592 { 00:16:13.592 "name": "BaseBdev3", 00:16:13.592 "uuid": 
"fd970701-0b55-4621-bfbc-180a53749d58", 00:16:13.592 "is_configured": true, 00:16:13.592 "data_offset": 0, 00:16:13.592 "data_size": 65536 00:16:13.592 } 00:16:13.592 ] 00:16:13.592 }' 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.592 18:19:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.159 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.159 18:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:14.417 18:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:14.417 18:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:14.675 [2024-07-12 18:19:58.316564] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:14.675 BaseBdev1 00:16:14.676 18:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:14.676 18:19:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:14.676 18:19:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:14.676 18:19:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:14.676 18:19:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:14.676 18:19:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:14.676 18:19:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.242 18:19:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:15.810 [ 00:16:15.810 { 00:16:15.810 "name": "BaseBdev1", 00:16:15.810 "aliases": [ 00:16:15.810 "1aeb2614-fcdb-4c32-97f1-fef8676f76a0" 00:16:15.810 ], 00:16:15.810 "product_name": "Malloc disk", 00:16:15.810 "block_size": 512, 00:16:15.810 "num_blocks": 65536, 00:16:15.810 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:15.810 "assigned_rate_limits": { 00:16:15.810 "rw_ios_per_sec": 0, 00:16:15.810 "rw_mbytes_per_sec": 0, 00:16:15.810 "r_mbytes_per_sec": 0, 00:16:15.810 "w_mbytes_per_sec": 0 00:16:15.810 }, 00:16:15.810 "claimed": true, 00:16:15.810 "claim_type": "exclusive_write", 00:16:15.810 "zoned": false, 00:16:15.810 "supported_io_types": { 00:16:15.810 "read": true, 00:16:15.810 "write": true, 00:16:15.810 "unmap": true, 00:16:15.810 "flush": true, 00:16:15.810 "reset": true, 00:16:15.810 "nvme_admin": false, 00:16:15.810 "nvme_io": false, 00:16:15.810 "nvme_io_md": false, 00:16:15.810 "write_zeroes": true, 00:16:15.810 "zcopy": true, 00:16:15.810 "get_zone_info": false, 00:16:15.810 "zone_management": false, 00:16:15.810 "zone_append": false, 00:16:15.810 "compare": false, 00:16:15.810 "compare_and_write": false, 00:16:15.810 "abort": true, 00:16:15.810 "seek_hole": false, 00:16:15.810 "seek_data": false, 00:16:15.810 "copy": true, 00:16:15.810 "nvme_iov_md": false 00:16:15.810 }, 00:16:15.810 "memory_domains": [ 00:16:15.810 { 00:16:15.810 "dma_device_id": "system", 00:16:15.810 "dma_device_type": 1 00:16:15.810 }, 00:16:15.810 { 00:16:15.810 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.810 "dma_device_type": 2 00:16:15.810 } 00:16:15.810 ], 00:16:15.810 "driver_specific": {} 00:16:15.810 } 00:16:15.810 ] 
00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.810 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.068 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.068 "name": "Existed_Raid", 00:16:16.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:16.068 "strip_size_kb": 0, 00:16:16.068 "state": "configuring", 00:16:16.068 "raid_level": "raid1", 00:16:16.068 "superblock": false, 00:16:16.068 "num_base_bdevs": 3, 00:16:16.068 
"num_base_bdevs_discovered": 2, 00:16:16.069 "num_base_bdevs_operational": 3, 00:16:16.069 "base_bdevs_list": [ 00:16:16.069 { 00:16:16.069 "name": "BaseBdev1", 00:16:16.069 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:16.069 "is_configured": true, 00:16:16.069 "data_offset": 0, 00:16:16.069 "data_size": 65536 00:16:16.069 }, 00:16:16.069 { 00:16:16.069 "name": null, 00:16:16.069 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:16.069 "is_configured": false, 00:16:16.069 "data_offset": 0, 00:16:16.069 "data_size": 65536 00:16:16.069 }, 00:16:16.069 { 00:16:16.069 "name": "BaseBdev3", 00:16:16.069 "uuid": "fd970701-0b55-4621-bfbc-180a53749d58", 00:16:16.069 "is_configured": true, 00:16:16.069 "data_offset": 0, 00:16:16.069 "data_size": 65536 00:16:16.069 } 00:16:16.069 ] 00:16:16.069 }' 00:16:16.069 18:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.069 18:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.634 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.634 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:16.892 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:16.892 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:17.150 [2024-07-12 18:20:00.686874] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.150 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:17.407 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.407 "name": "Existed_Raid", 00:16:17.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:17.407 "strip_size_kb": 0, 00:16:17.407 "state": "configuring", 00:16:17.407 "raid_level": "raid1", 00:16:17.407 "superblock": false, 00:16:17.407 "num_base_bdevs": 3, 00:16:17.407 "num_base_bdevs_discovered": 1, 00:16:17.407 "num_base_bdevs_operational": 3, 00:16:17.407 "base_bdevs_list": [ 00:16:17.407 { 00:16:17.407 "name": "BaseBdev1", 00:16:17.407 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:17.407 "is_configured": true, 00:16:17.407 "data_offset": 0, 00:16:17.407 "data_size": 65536 
00:16:17.407 }, 00:16:17.407 { 00:16:17.407 "name": null, 00:16:17.407 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:17.407 "is_configured": false, 00:16:17.407 "data_offset": 0, 00:16:17.407 "data_size": 65536 00:16:17.407 }, 00:16:17.407 { 00:16:17.407 "name": null, 00:16:17.407 "uuid": "fd970701-0b55-4621-bfbc-180a53749d58", 00:16:17.407 "is_configured": false, 00:16:17.407 "data_offset": 0, 00:16:17.407 "data_size": 65536 00:16:17.408 } 00:16:17.408 ] 00:16:17.408 }' 00:16:17.408 18:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.408 18:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:17.973 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.973 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:18.232 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:18.232 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:18.491 [2024-07-12 18:20:01.966265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:18.491 18:20:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.491 18:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:18.750 18:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.750 "name": "Existed_Raid", 00:16:18.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.750 "strip_size_kb": 0, 00:16:18.750 "state": "configuring", 00:16:18.750 "raid_level": "raid1", 00:16:18.750 "superblock": false, 00:16:18.750 "num_base_bdevs": 3, 00:16:18.750 "num_base_bdevs_discovered": 2, 00:16:18.750 "num_base_bdevs_operational": 3, 00:16:18.750 "base_bdevs_list": [ 00:16:18.750 { 00:16:18.750 "name": "BaseBdev1", 00:16:18.750 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:18.750 "is_configured": true, 00:16:18.750 "data_offset": 0, 00:16:18.750 "data_size": 65536 00:16:18.750 }, 00:16:18.750 { 00:16:18.750 "name": null, 00:16:18.750 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:18.750 "is_configured": false, 00:16:18.750 "data_offset": 0, 00:16:18.750 "data_size": 65536 00:16:18.750 }, 00:16:18.750 { 00:16:18.750 "name": "BaseBdev3", 00:16:18.750 "uuid": 
"fd970701-0b55-4621-bfbc-180a53749d58", 00:16:18.750 "is_configured": true, 00:16:18.750 "data_offset": 0, 00:16:18.750 "data_size": 65536 00:16:18.750 } 00:16:18.750 ] 00:16:18.750 }' 00:16:18.750 18:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.750 18:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.317 18:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.317 18:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:19.317 18:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:19.317 18:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:19.575 [2024-07-12 18:20:03.177501] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.575 18:20:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.575 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:19.834 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.834 "name": "Existed_Raid", 00:16:19.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.834 "strip_size_kb": 0, 00:16:19.834 "state": "configuring", 00:16:19.834 "raid_level": "raid1", 00:16:19.834 "superblock": false, 00:16:19.834 "num_base_bdevs": 3, 00:16:19.834 "num_base_bdevs_discovered": 1, 00:16:19.834 "num_base_bdevs_operational": 3, 00:16:19.834 "base_bdevs_list": [ 00:16:19.834 { 00:16:19.834 "name": null, 00:16:19.834 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:19.834 "is_configured": false, 00:16:19.834 "data_offset": 0, 00:16:19.834 "data_size": 65536 00:16:19.834 }, 00:16:19.834 { 00:16:19.834 "name": null, 00:16:19.834 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:19.834 "is_configured": false, 00:16:19.834 "data_offset": 0, 00:16:19.834 "data_size": 65536 00:16:19.834 }, 00:16:19.834 { 00:16:19.834 "name": "BaseBdev3", 00:16:19.834 "uuid": "fd970701-0b55-4621-bfbc-180a53749d58", 00:16:19.834 "is_configured": true, 00:16:19.834 "data_offset": 0, 00:16:19.834 "data_size": 65536 00:16:19.834 } 00:16:19.834 ] 00:16:19.834 }' 00:16:19.834 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.834 18:20:03 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:16:20.401 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.401 18:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:20.401 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:20.401 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:20.660 [2024-07-12 18:20:04.342966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.660 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:20.919 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.919 "name": "Existed_Raid", 00:16:20.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.919 "strip_size_kb": 0, 00:16:20.919 "state": "configuring", 00:16:20.919 "raid_level": "raid1", 00:16:20.919 "superblock": false, 00:16:20.919 "num_base_bdevs": 3, 00:16:20.919 "num_base_bdevs_discovered": 2, 00:16:20.919 "num_base_bdevs_operational": 3, 00:16:20.919 "base_bdevs_list": [ 00:16:20.919 { 00:16:20.919 "name": null, 00:16:20.919 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:20.919 "is_configured": false, 00:16:20.919 "data_offset": 0, 00:16:20.919 "data_size": 65536 00:16:20.919 }, 00:16:20.919 { 00:16:20.919 "name": "BaseBdev2", 00:16:20.919 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:20.919 "is_configured": true, 00:16:20.919 "data_offset": 0, 00:16:20.919 "data_size": 65536 00:16:20.919 }, 00:16:20.919 { 00:16:20.919 "name": "BaseBdev3", 00:16:20.919 "uuid": "fd970701-0b55-4621-bfbc-180a53749d58", 00:16:20.919 "is_configured": true, 00:16:20.919 "data_offset": 0, 00:16:20.919 "data_size": 65536 00:16:20.919 } 00:16:20.919 ] 00:16:20.919 }' 00:16:20.919 18:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.919 18:20:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.486 18:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:21.486 18:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.746 18:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:21.746 18:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.746 18:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:22.005 18:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1aeb2614-fcdb-4c32-97f1-fef8676f76a0 00:16:22.005 [2024-07-12 18:20:05.715169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:22.005 [2024-07-12 18:20:05.715207] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13e4e40 00:16:22.005 [2024-07-12 18:20:05.715216] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:22.005 [2024-07-12 18:20:05.715406] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e1e60 00:16:22.005 [2024-07-12 18:20:05.715525] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13e4e40 00:16:22.005 [2024-07-12 18:20:05.715535] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13e4e40 00:16:22.005 [2024-07-12 18:20:05.715695] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:22.005 NewBaseBdev 00:16:22.005 18:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:22.005 18:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:22.005 18:20:05 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:22.005 18:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:22.005 18:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:22.005 18:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:22.005 18:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:22.270 18:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:22.532 [ 00:16:22.532 { 00:16:22.532 "name": "NewBaseBdev", 00:16:22.532 "aliases": [ 00:16:22.532 "1aeb2614-fcdb-4c32-97f1-fef8676f76a0" 00:16:22.532 ], 00:16:22.532 "product_name": "Malloc disk", 00:16:22.532 "block_size": 512, 00:16:22.532 "num_blocks": 65536, 00:16:22.532 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:22.532 "assigned_rate_limits": { 00:16:22.532 "rw_ios_per_sec": 0, 00:16:22.532 "rw_mbytes_per_sec": 0, 00:16:22.532 "r_mbytes_per_sec": 0, 00:16:22.532 "w_mbytes_per_sec": 0 00:16:22.532 }, 00:16:22.532 "claimed": true, 00:16:22.532 "claim_type": "exclusive_write", 00:16:22.532 "zoned": false, 00:16:22.532 "supported_io_types": { 00:16:22.532 "read": true, 00:16:22.532 "write": true, 00:16:22.532 "unmap": true, 00:16:22.532 "flush": true, 00:16:22.532 "reset": true, 00:16:22.532 "nvme_admin": false, 00:16:22.532 "nvme_io": false, 00:16:22.532 "nvme_io_md": false, 00:16:22.532 "write_zeroes": true, 00:16:22.532 "zcopy": true, 00:16:22.532 "get_zone_info": false, 00:16:22.532 "zone_management": false, 00:16:22.532 "zone_append": false, 00:16:22.532 "compare": false, 00:16:22.533 "compare_and_write": false, 
00:16:22.533 "abort": true, 00:16:22.533 "seek_hole": false, 00:16:22.533 "seek_data": false, 00:16:22.533 "copy": true, 00:16:22.533 "nvme_iov_md": false 00:16:22.533 }, 00:16:22.533 "memory_domains": [ 00:16:22.533 { 00:16:22.533 "dma_device_id": "system", 00:16:22.533 "dma_device_type": 1 00:16:22.533 }, 00:16:22.533 { 00:16:22.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.533 "dma_device_type": 2 00:16:22.533 } 00:16:22.533 ], 00:16:22.533 "driver_specific": {} 00:16:22.533 } 00:16:22.533 ] 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.533 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.533 18:20:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:22.791 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:22.791 "name": "Existed_Raid", 00:16:22.791 "uuid": "23414028-2280-4503-94ec-c4eeba15f6d0", 00:16:22.791 "strip_size_kb": 0, 00:16:22.791 "state": "online", 00:16:22.791 "raid_level": "raid1", 00:16:22.791 "superblock": false, 00:16:22.791 "num_base_bdevs": 3, 00:16:22.791 "num_base_bdevs_discovered": 3, 00:16:22.791 "num_base_bdevs_operational": 3, 00:16:22.791 "base_bdevs_list": [ 00:16:22.791 { 00:16:22.791 "name": "NewBaseBdev", 00:16:22.791 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:22.791 "is_configured": true, 00:16:22.791 "data_offset": 0, 00:16:22.791 "data_size": 65536 00:16:22.791 }, 00:16:22.791 { 00:16:22.792 "name": "BaseBdev2", 00:16:22.792 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:22.792 "is_configured": true, 00:16:22.792 "data_offset": 0, 00:16:22.792 "data_size": 65536 00:16:22.792 }, 00:16:22.792 { 00:16:22.792 "name": "BaseBdev3", 00:16:22.792 "uuid": "fd970701-0b55-4621-bfbc-180a53749d58", 00:16:22.792 "is_configured": true, 00:16:22.792 "data_offset": 0, 00:16:22.792 "data_size": 65536 00:16:22.792 } 00:16:22.792 ] 00:16:22.792 }' 00:16:22.792 18:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:22.792 18:20:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:23.728 
18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:23.728 [2024-07-12 18:20:07.371865] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:23.728 "name": "Existed_Raid", 00:16:23.728 "aliases": [ 00:16:23.728 "23414028-2280-4503-94ec-c4eeba15f6d0" 00:16:23.728 ], 00:16:23.728 "product_name": "Raid Volume", 00:16:23.728 "block_size": 512, 00:16:23.728 "num_blocks": 65536, 00:16:23.728 "uuid": "23414028-2280-4503-94ec-c4eeba15f6d0", 00:16:23.728 "assigned_rate_limits": { 00:16:23.728 "rw_ios_per_sec": 0, 00:16:23.728 "rw_mbytes_per_sec": 0, 00:16:23.728 "r_mbytes_per_sec": 0, 00:16:23.728 "w_mbytes_per_sec": 0 00:16:23.728 }, 00:16:23.728 "claimed": false, 00:16:23.728 "zoned": false, 00:16:23.728 "supported_io_types": { 00:16:23.728 "read": true, 00:16:23.728 "write": true, 00:16:23.728 "unmap": false, 00:16:23.728 "flush": false, 00:16:23.728 "reset": true, 00:16:23.728 "nvme_admin": false, 00:16:23.728 "nvme_io": false, 00:16:23.728 "nvme_io_md": false, 00:16:23.728 "write_zeroes": true, 00:16:23.728 "zcopy": false, 00:16:23.728 "get_zone_info": false, 00:16:23.728 "zone_management": false, 00:16:23.728 "zone_append": false, 00:16:23.728 "compare": false, 00:16:23.728 "compare_and_write": false, 00:16:23.728 "abort": false, 00:16:23.728 "seek_hole": false, 00:16:23.728 "seek_data": false, 00:16:23.728 "copy": false, 00:16:23.728 "nvme_iov_md": false 00:16:23.728 }, 00:16:23.728 
"memory_domains": [ 00:16:23.728 { 00:16:23.728 "dma_device_id": "system", 00:16:23.728 "dma_device_type": 1 00:16:23.728 }, 00:16:23.728 { 00:16:23.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.728 "dma_device_type": 2 00:16:23.728 }, 00:16:23.728 { 00:16:23.728 "dma_device_id": "system", 00:16:23.728 "dma_device_type": 1 00:16:23.728 }, 00:16:23.728 { 00:16:23.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.728 "dma_device_type": 2 00:16:23.728 }, 00:16:23.728 { 00:16:23.728 "dma_device_id": "system", 00:16:23.728 "dma_device_type": 1 00:16:23.728 }, 00:16:23.728 { 00:16:23.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.728 "dma_device_type": 2 00:16:23.728 } 00:16:23.728 ], 00:16:23.728 "driver_specific": { 00:16:23.728 "raid": { 00:16:23.728 "uuid": "23414028-2280-4503-94ec-c4eeba15f6d0", 00:16:23.728 "strip_size_kb": 0, 00:16:23.728 "state": "online", 00:16:23.728 "raid_level": "raid1", 00:16:23.728 "superblock": false, 00:16:23.728 "num_base_bdevs": 3, 00:16:23.728 "num_base_bdevs_discovered": 3, 00:16:23.728 "num_base_bdevs_operational": 3, 00:16:23.728 "base_bdevs_list": [ 00:16:23.728 { 00:16:23.728 "name": "NewBaseBdev", 00:16:23.728 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:23.728 "is_configured": true, 00:16:23.728 "data_offset": 0, 00:16:23.728 "data_size": 65536 00:16:23.728 }, 00:16:23.728 { 00:16:23.728 "name": "BaseBdev2", 00:16:23.728 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:23.728 "is_configured": true, 00:16:23.728 "data_offset": 0, 00:16:23.728 "data_size": 65536 00:16:23.728 }, 00:16:23.728 { 00:16:23.728 "name": "BaseBdev3", 00:16:23.728 "uuid": "fd970701-0b55-4621-bfbc-180a53749d58", 00:16:23.728 "is_configured": true, 00:16:23.728 "data_offset": 0, 00:16:23.728 "data_size": 65536 00:16:23.728 } 00:16:23.728 ] 00:16:23.728 } 00:16:23.728 } 00:16:23.728 }' 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | 
select(.is_configured == true).name' 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:23.728 BaseBdev2 00:16:23.728 BaseBdev3' 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:23.728 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:23.987 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:23.987 "name": "NewBaseBdev", 00:16:23.987 "aliases": [ 00:16:23.987 "1aeb2614-fcdb-4c32-97f1-fef8676f76a0" 00:16:23.987 ], 00:16:23.987 "product_name": "Malloc disk", 00:16:23.987 "block_size": 512, 00:16:23.987 "num_blocks": 65536, 00:16:23.987 "uuid": "1aeb2614-fcdb-4c32-97f1-fef8676f76a0", 00:16:23.987 "assigned_rate_limits": { 00:16:23.987 "rw_ios_per_sec": 0, 00:16:23.987 "rw_mbytes_per_sec": 0, 00:16:23.987 "r_mbytes_per_sec": 0, 00:16:23.987 "w_mbytes_per_sec": 0 00:16:23.987 }, 00:16:23.987 "claimed": true, 00:16:23.987 "claim_type": "exclusive_write", 00:16:23.987 "zoned": false, 00:16:23.987 "supported_io_types": { 00:16:23.987 "read": true, 00:16:23.987 "write": true, 00:16:23.987 "unmap": true, 00:16:23.987 "flush": true, 00:16:23.987 "reset": true, 00:16:23.987 "nvme_admin": false, 00:16:23.987 "nvme_io": false, 00:16:23.987 "nvme_io_md": false, 00:16:23.987 "write_zeroes": true, 00:16:23.987 "zcopy": true, 00:16:23.987 "get_zone_info": false, 00:16:23.987 "zone_management": false, 00:16:23.987 "zone_append": false, 00:16:23.987 "compare": false, 00:16:23.987 "compare_and_write": false, 00:16:23.987 "abort": true, 00:16:23.987 "seek_hole": false, 00:16:23.987 "seek_data": false, 00:16:23.987 "copy": true, 00:16:23.987 "nvme_iov_md": 
false 00:16:23.987 }, 00:16:23.987 "memory_domains": [ 00:16:23.987 { 00:16:23.987 "dma_device_id": "system", 00:16:23.987 "dma_device_type": 1 00:16:23.987 }, 00:16:23.987 { 00:16:23.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.987 "dma_device_type": 2 00:16:23.987 } 00:16:23.987 ], 00:16:23.987 "driver_specific": {} 00:16:23.987 }' 00:16:23.987 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.246 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.246 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:24.246 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.246 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.246 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:24.246 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.246 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.505 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:24.505 18:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.505 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.505 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:24.505 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:24.505 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:24.505 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:16:24.764 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:24.764 "name": "BaseBdev2", 00:16:24.764 "aliases": [ 00:16:24.764 "c119ee5f-d23e-4da6-a668-9dced66414dc" 00:16:24.764 ], 00:16:24.764 "product_name": "Malloc disk", 00:16:24.764 "block_size": 512, 00:16:24.764 "num_blocks": 65536, 00:16:24.764 "uuid": "c119ee5f-d23e-4da6-a668-9dced66414dc", 00:16:24.764 "assigned_rate_limits": { 00:16:24.764 "rw_ios_per_sec": 0, 00:16:24.764 "rw_mbytes_per_sec": 0, 00:16:24.764 "r_mbytes_per_sec": 0, 00:16:24.764 "w_mbytes_per_sec": 0 00:16:24.764 }, 00:16:24.764 "claimed": true, 00:16:24.764 "claim_type": "exclusive_write", 00:16:24.764 "zoned": false, 00:16:24.764 "supported_io_types": { 00:16:24.764 "read": true, 00:16:24.764 "write": true, 00:16:24.764 "unmap": true, 00:16:24.764 "flush": true, 00:16:24.764 "reset": true, 00:16:24.764 "nvme_admin": false, 00:16:24.764 "nvme_io": false, 00:16:24.764 "nvme_io_md": false, 00:16:24.764 "write_zeroes": true, 00:16:24.764 "zcopy": true, 00:16:24.764 "get_zone_info": false, 00:16:24.764 "zone_management": false, 00:16:24.764 "zone_append": false, 00:16:24.764 "compare": false, 00:16:24.764 "compare_and_write": false, 00:16:24.764 "abort": true, 00:16:24.764 "seek_hole": false, 00:16:24.764 "seek_data": false, 00:16:24.764 "copy": true, 00:16:24.764 "nvme_iov_md": false 00:16:24.764 }, 00:16:24.764 "memory_domains": [ 00:16:24.764 { 00:16:24.764 "dma_device_id": "system", 00:16:24.764 "dma_device_type": 1 00:16:24.764 }, 00:16:24.764 { 00:16:24.764 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.764 "dma_device_type": 2 00:16:24.764 } 00:16:24.764 ], 00:16:24.764 "driver_specific": {} 00:16:24.764 }' 00:16:24.764 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.764 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.764 18:20:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:24.764 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:25.023 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.281 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.281 "name": "BaseBdev3", 00:16:25.281 "aliases": [ 00:16:25.281 "fd970701-0b55-4621-bfbc-180a53749d58" 00:16:25.281 ], 00:16:25.281 "product_name": "Malloc disk", 00:16:25.281 "block_size": 512, 00:16:25.281 "num_blocks": 65536, 00:16:25.281 "uuid": "fd970701-0b55-4621-bfbc-180a53749d58", 00:16:25.281 "assigned_rate_limits": { 00:16:25.281 "rw_ios_per_sec": 0, 00:16:25.281 "rw_mbytes_per_sec": 0, 00:16:25.281 "r_mbytes_per_sec": 0, 00:16:25.281 "w_mbytes_per_sec": 0 00:16:25.281 }, 
00:16:25.281 "claimed": true, 00:16:25.281 "claim_type": "exclusive_write", 00:16:25.281 "zoned": false, 00:16:25.281 "supported_io_types": { 00:16:25.281 "read": true, 00:16:25.282 "write": true, 00:16:25.282 "unmap": true, 00:16:25.282 "flush": true, 00:16:25.282 "reset": true, 00:16:25.282 "nvme_admin": false, 00:16:25.282 "nvme_io": false, 00:16:25.282 "nvme_io_md": false, 00:16:25.282 "write_zeroes": true, 00:16:25.282 "zcopy": true, 00:16:25.282 "get_zone_info": false, 00:16:25.282 "zone_management": false, 00:16:25.282 "zone_append": false, 00:16:25.282 "compare": false, 00:16:25.282 "compare_and_write": false, 00:16:25.282 "abort": true, 00:16:25.282 "seek_hole": false, 00:16:25.282 "seek_data": false, 00:16:25.282 "copy": true, 00:16:25.282 "nvme_iov_md": false 00:16:25.282 }, 00:16:25.282 "memory_domains": [ 00:16:25.282 { 00:16:25.282 "dma_device_id": "system", 00:16:25.282 "dma_device_type": 1 00:16:25.282 }, 00:16:25.282 { 00:16:25.282 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.282 "dma_device_type": 2 00:16:25.282 } 00:16:25.282 ], 00:16:25.282 "driver_specific": {} 00:16:25.282 }' 00:16:25.282 18:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.540 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.540 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:25.540 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.540 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.540 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.540 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.540 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.799 18:20:09 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.799 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.799 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.799 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.799 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:26.058 [2024-07-12 18:20:09.609519] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:26.058 [2024-07-12 18:20:09.609543] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:26.058 [2024-07-12 18:20:09.609591] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:26.058 [2024-07-12 18:20:09.609847] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:26.058 [2024-07-12 18:20:09.609859] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e4e40 name Existed_Raid, state offline 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2504194 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2504194 ']' 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2504194 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2504194 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2504194' 00:16:26.058 killing process with pid 2504194 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2504194 00:16:26.058 [2024-07-12 18:20:09.681817] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:26.058 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2504194 00:16:26.058 [2024-07-12 18:20:09.708795] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:26.317 18:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:26.317 00:16:26.317 real 0m28.636s 00:16:26.317 user 0m52.503s 00:16:26.317 sys 0m5.116s 00:16:26.317 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:26.317 18:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.317 ************************************ 00:16:26.317 END TEST raid_state_function_test 00:16:26.317 ************************************ 00:16:26.317 18:20:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:26.317 18:20:09 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:16:26.317 18:20:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:26.317 18:20:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:26.317 18:20:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:26.317 ************************************ 00:16:26.317 START TEST raid_state_function_test_sb 00:16:26.317 ************************************ 00:16:26.317 18:20:10 
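The test above repeatedly extracts the configured base bdev names from `bdev_get_bdevs` output with the jq filter `'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'` (bdev_raid.sh@201). A minimal sketch of that same selection logic in Python, using illustrative sample data shaped like the log's JSON (the names are placeholders, not taken from a real run):

```python
import json

# Sample shaped like the rpc.py bdev_get_bdevs output seen in the log;
# the entries below are illustrative placeholders.
raid_bdev_info = json.loads("""
{
  "name": "Existed_Raid",
  "driver_specific": {
    "raid": {
      "base_bdevs_list": [
        {"name": "NewBaseBdev", "is_configured": true},
        {"name": "BaseBdev2", "is_configured": true},
        {"name": "BaseBdev3", "is_configured": false}
      ]
    }
  }
}
""")

# Equivalent of:
#   jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
configured = [
    b["name"]
    for b in raid_bdev_info["driver_specific"]["raid"]["base_bdevs_list"]
    if b["is_configured"]
]
print(configured)  # ['NewBaseBdev', 'BaseBdev2']
```

The script then loops over each returned name and re-queries `bdev_get_bdevs -b <name>` to check `block_size`, `md_size`, `md_interleave`, and `dif_type` per base bdev, exactly as the `for name in $base_bdev_names` block above shows.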
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:26.317 
18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2508585 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2508585' 00:16:26.317 Process raid pid: 2508585 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2508585 /var/tmp/spdk-raid.sock 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2508585 ']' 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:26.317 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:26.317 18:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:26.318 18:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:26.576 [2024-07-12 18:20:10.084938] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:16:26.576 [2024-07-12 18:20:10.085003] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:26.576 [2024-07-12 18:20:10.215061] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:26.834 [2024-07-12 18:20:10.317915] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:26.834 [2024-07-12 18:20:10.387876] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:26.834 [2024-07-12 18:20:10.387905] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:27.400 18:20:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:27.400 18:20:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:27.400 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:27.660 [2024-07-12 18:20:11.218687] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:27.660 [2024-07-12 18:20:11.218731] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:16:27.660 [2024-07-12 18:20:11.218742] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:27.660 [2024-07-12 18:20:11.218754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:27.660 [2024-07-12 18:20:11.218763] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:27.660 [2024-07-12 18:20:11.218774] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.660 18:20:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:28.252 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.252 "name": "Existed_Raid", 00:16:28.252 "uuid": "99ddda6d-1cb7-4291-8441-43eefce55aab", 00:16:28.252 "strip_size_kb": 0, 00:16:28.252 "state": "configuring", 00:16:28.252 "raid_level": "raid1", 00:16:28.252 "superblock": true, 00:16:28.252 "num_base_bdevs": 3, 00:16:28.252 "num_base_bdevs_discovered": 0, 00:16:28.252 "num_base_bdevs_operational": 3, 00:16:28.252 "base_bdevs_list": [ 00:16:28.252 { 00:16:28.252 "name": "BaseBdev1", 00:16:28.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.252 "is_configured": false, 00:16:28.252 "data_offset": 0, 00:16:28.252 "data_size": 0 00:16:28.252 }, 00:16:28.252 { 00:16:28.252 "name": "BaseBdev2", 00:16:28.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.252 "is_configured": false, 00:16:28.252 "data_offset": 0, 00:16:28.252 "data_size": 0 00:16:28.252 }, 00:16:28.252 { 00:16:28.252 "name": "BaseBdev3", 00:16:28.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.252 "is_configured": false, 00:16:28.252 "data_offset": 0, 00:16:28.252 "data_size": 0 00:16:28.252 } 00:16:28.252 ] 00:16:28.252 }' 00:16:28.252 18:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.252 18:20:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:28.819 18:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:29.078 [2024-07-12 18:20:12.566110] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:29.078 [2024-07-12 18:20:12.566144] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe9a80 name Existed_Raid, state configuring 00:16:29.078 18:20:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:29.078 [2024-07-12 18:20:12.798747] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:29.078 [2024-07-12 18:20:12.798781] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:29.078 [2024-07-12 18:20:12.798791] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:29.078 [2024-07-12 18:20:12.798803] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:29.078 [2024-07-12 18:20:12.798812] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:29.078 [2024-07-12 18:20:12.798823] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:29.336 18:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:29.593 [2024-07-12 18:20:13.306345] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:29.593 BaseBdev1 00:16:29.850 18:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:29.850 18:20:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:29.850 18:20:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:29.850 18:20:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:29.850 18:20:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:29.850 18:20:13 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:29.850 18:20:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:29.850 18:20:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:30.416 [ 00:16:30.416 { 00:16:30.416 "name": "BaseBdev1", 00:16:30.416 "aliases": [ 00:16:30.416 "3042ebb7-b5e1-4065-8ab5-682836373b8a" 00:16:30.416 ], 00:16:30.416 "product_name": "Malloc disk", 00:16:30.416 "block_size": 512, 00:16:30.416 "num_blocks": 65536, 00:16:30.416 "uuid": "3042ebb7-b5e1-4065-8ab5-682836373b8a", 00:16:30.416 "assigned_rate_limits": { 00:16:30.416 "rw_ios_per_sec": 0, 00:16:30.416 "rw_mbytes_per_sec": 0, 00:16:30.416 "r_mbytes_per_sec": 0, 00:16:30.416 "w_mbytes_per_sec": 0 00:16:30.416 }, 00:16:30.416 "claimed": true, 00:16:30.416 "claim_type": "exclusive_write", 00:16:30.416 "zoned": false, 00:16:30.416 "supported_io_types": { 00:16:30.416 "read": true, 00:16:30.416 "write": true, 00:16:30.416 "unmap": true, 00:16:30.416 "flush": true, 00:16:30.416 "reset": true, 00:16:30.416 "nvme_admin": false, 00:16:30.416 "nvme_io": false, 00:16:30.416 "nvme_io_md": false, 00:16:30.416 "write_zeroes": true, 00:16:30.416 "zcopy": true, 00:16:30.416 "get_zone_info": false, 00:16:30.416 "zone_management": false, 00:16:30.416 "zone_append": false, 00:16:30.416 "compare": false, 00:16:30.416 "compare_and_write": false, 00:16:30.416 "abort": true, 00:16:30.416 "seek_hole": false, 00:16:30.416 "seek_data": false, 00:16:30.416 "copy": true, 00:16:30.416 "nvme_iov_md": false 00:16:30.416 }, 00:16:30.416 "memory_domains": [ 00:16:30.416 { 00:16:30.416 "dma_device_id": "system", 00:16:30.416 "dma_device_type": 1 00:16:30.416 }, 
00:16:30.416 { 00:16:30.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.416 "dma_device_type": 2 00:16:30.416 } 00:16:30.416 ], 00:16:30.416 "driver_specific": {} 00:16:30.416 } 00:16:30.416 ] 00:16:30.416 18:20:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:30.416 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:30.416 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:30.416 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:30.416 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:30.416 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:30.416 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:30.416 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.417 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.417 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.417 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.417 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.417 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:30.675 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:30.675 "name": "Existed_Raid", 00:16:30.675 "uuid": 
"53ef54a9-d2a7-48d9-95ef-32f7dfb0a5e1", 00:16:30.675 "strip_size_kb": 0, 00:16:30.675 "state": "configuring", 00:16:30.675 "raid_level": "raid1", 00:16:30.675 "superblock": true, 00:16:30.675 "num_base_bdevs": 3, 00:16:30.675 "num_base_bdevs_discovered": 1, 00:16:30.675 "num_base_bdevs_operational": 3, 00:16:30.675 "base_bdevs_list": [ 00:16:30.675 { 00:16:30.675 "name": "BaseBdev1", 00:16:30.675 "uuid": "3042ebb7-b5e1-4065-8ab5-682836373b8a", 00:16:30.675 "is_configured": true, 00:16:30.675 "data_offset": 2048, 00:16:30.675 "data_size": 63488 00:16:30.675 }, 00:16:30.675 { 00:16:30.675 "name": "BaseBdev2", 00:16:30.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.675 "is_configured": false, 00:16:30.675 "data_offset": 0, 00:16:30.675 "data_size": 0 00:16:30.675 }, 00:16:30.675 { 00:16:30.675 "name": "BaseBdev3", 00:16:30.675 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:30.675 "is_configured": false, 00:16:30.675 "data_offset": 0, 00:16:30.675 "data_size": 0 00:16:30.675 } 00:16:30.675 ] 00:16:30.675 }' 00:16:30.675 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:30.675 18:20:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.242 18:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:31.500 [2024-07-12 18:20:15.119135] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:31.500 [2024-07-12 18:20:15.119173] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe9310 name Existed_Raid, state configuring 00:16:31.500 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 
00:16:31.758 [2024-07-12 18:20:15.367830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:31.758 [2024-07-12 18:20:15.369273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:31.758 [2024-07-12 18:20:15.369306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:31.758 [2024-07-12 18:20:15.369316] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:31.758 [2024-07-12 18:20:15.369328] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.758 18:20:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.758 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.017 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.017 "name": "Existed_Raid", 00:16:32.017 "uuid": "39bc0c0a-0821-491f-b084-2cfa6db9007f", 00:16:32.017 "strip_size_kb": 0, 00:16:32.017 "state": "configuring", 00:16:32.017 "raid_level": "raid1", 00:16:32.017 "superblock": true, 00:16:32.017 "num_base_bdevs": 3, 00:16:32.017 "num_base_bdevs_discovered": 1, 00:16:32.017 "num_base_bdevs_operational": 3, 00:16:32.017 "base_bdevs_list": [ 00:16:32.017 { 00:16:32.017 "name": "BaseBdev1", 00:16:32.017 "uuid": "3042ebb7-b5e1-4065-8ab5-682836373b8a", 00:16:32.017 "is_configured": true, 00:16:32.017 "data_offset": 2048, 00:16:32.017 "data_size": 63488 00:16:32.017 }, 00:16:32.017 { 00:16:32.017 "name": "BaseBdev2", 00:16:32.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.017 "is_configured": false, 00:16:32.017 "data_offset": 0, 00:16:32.017 "data_size": 0 00:16:32.017 }, 00:16:32.017 { 00:16:32.017 "name": "BaseBdev3", 00:16:32.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.017 "is_configured": false, 00:16:32.017 "data_offset": 0, 00:16:32.017 "data_size": 0 00:16:32.017 } 00:16:32.017 ] 00:16:32.017 }' 00:16:32.017 18:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.017 18:20:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:32.585 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:32.585 [2024-07-12 18:20:16.241747] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:32.585 BaseBdev2 00:16:32.585 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:32.585 18:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:32.585 18:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:32.585 18:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:32.585 18:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:32.585 18:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:32.585 18:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:32.843 18:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:33.102 [ 00:16:33.103 { 00:16:33.103 "name": "BaseBdev2", 00:16:33.103 "aliases": [ 00:16:33.103 "cc6f28bf-3afb-4aa9-b8ed-f84481026c99" 00:16:33.103 ], 00:16:33.103 "product_name": "Malloc disk", 00:16:33.103 "block_size": 512, 00:16:33.103 "num_blocks": 65536, 00:16:33.103 "uuid": "cc6f28bf-3afb-4aa9-b8ed-f84481026c99", 00:16:33.103 "assigned_rate_limits": { 00:16:33.103 "rw_ios_per_sec": 0, 00:16:33.103 "rw_mbytes_per_sec": 0, 00:16:33.103 "r_mbytes_per_sec": 0, 00:16:33.103 "w_mbytes_per_sec": 0 00:16:33.103 }, 00:16:33.103 "claimed": true, 00:16:33.103 "claim_type": "exclusive_write", 00:16:33.103 "zoned": false, 00:16:33.103 "supported_io_types": { 
00:16:33.103 "read": true, 00:16:33.103 "write": true, 00:16:33.103 "unmap": true, 00:16:33.103 "flush": true, 00:16:33.103 "reset": true, 00:16:33.103 "nvme_admin": false, 00:16:33.103 "nvme_io": false, 00:16:33.103 "nvme_io_md": false, 00:16:33.103 "write_zeroes": true, 00:16:33.103 "zcopy": true, 00:16:33.103 "get_zone_info": false, 00:16:33.103 "zone_management": false, 00:16:33.103 "zone_append": false, 00:16:33.103 "compare": false, 00:16:33.103 "compare_and_write": false, 00:16:33.103 "abort": true, 00:16:33.103 "seek_hole": false, 00:16:33.103 "seek_data": false, 00:16:33.103 "copy": true, 00:16:33.103 "nvme_iov_md": false 00:16:33.103 }, 00:16:33.103 "memory_domains": [ 00:16:33.103 { 00:16:33.103 "dma_device_id": "system", 00:16:33.103 "dma_device_type": 1 00:16:33.103 }, 00:16:33.103 { 00:16:33.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.103 "dma_device_type": 2 00:16:33.103 } 00:16:33.103 ], 00:16:33.103 "driver_specific": {} 00:16:33.103 } 00:16:33.103 ] 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.103 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.362 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.362 "name": "Existed_Raid", 00:16:33.362 "uuid": "39bc0c0a-0821-491f-b084-2cfa6db9007f", 00:16:33.362 "strip_size_kb": 0, 00:16:33.362 "state": "configuring", 00:16:33.362 "raid_level": "raid1", 00:16:33.362 "superblock": true, 00:16:33.362 "num_base_bdevs": 3, 00:16:33.362 "num_base_bdevs_discovered": 2, 00:16:33.362 "num_base_bdevs_operational": 3, 00:16:33.362 "base_bdevs_list": [ 00:16:33.362 { 00:16:33.362 "name": "BaseBdev1", 00:16:33.362 "uuid": "3042ebb7-b5e1-4065-8ab5-682836373b8a", 00:16:33.362 "is_configured": true, 00:16:33.362 "data_offset": 2048, 00:16:33.362 "data_size": 63488 00:16:33.362 }, 00:16:33.362 { 00:16:33.362 "name": "BaseBdev2", 00:16:33.362 "uuid": "cc6f28bf-3afb-4aa9-b8ed-f84481026c99", 00:16:33.362 "is_configured": true, 00:16:33.362 "data_offset": 2048, 00:16:33.362 "data_size": 63488 00:16:33.362 }, 00:16:33.362 { 00:16:33.362 "name": "BaseBdev3", 00:16:33.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.362 "is_configured": false, 00:16:33.362 "data_offset": 0, 00:16:33.362 
"data_size": 0 00:16:33.362 } 00:16:33.362 ] 00:16:33.362 }' 00:16:33.362 18:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.362 18:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:34.298 18:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:34.298 [2024-07-12 18:20:17.953917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:34.298 [2024-07-12 18:20:17.954086] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbea400 00:16:34.298 [2024-07-12 18:20:17.954100] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:34.298 [2024-07-12 18:20:17.954272] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbe9ef0 00:16:34.298 [2024-07-12 18:20:17.954391] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbea400 00:16:34.298 [2024-07-12 18:20:17.954401] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbea400 00:16:34.298 [2024-07-12 18:20:17.954491] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:34.298 BaseBdev3 00:16:34.298 18:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:34.298 18:20:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:34.298 18:20:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:34.298 18:20:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:34.298 18:20:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:34.298 18:20:17 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:34.298 18:20:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:34.557 18:20:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:34.815 [ 00:16:34.815 { 00:16:34.815 "name": "BaseBdev3", 00:16:34.815 "aliases": [ 00:16:34.815 "5280a71e-5767-4063-9f38-d596ba81619e" 00:16:34.815 ], 00:16:34.815 "product_name": "Malloc disk", 00:16:34.815 "block_size": 512, 00:16:34.815 "num_blocks": 65536, 00:16:34.815 "uuid": "5280a71e-5767-4063-9f38-d596ba81619e", 00:16:34.815 "assigned_rate_limits": { 00:16:34.815 "rw_ios_per_sec": 0, 00:16:34.815 "rw_mbytes_per_sec": 0, 00:16:34.815 "r_mbytes_per_sec": 0, 00:16:34.815 "w_mbytes_per_sec": 0 00:16:34.815 }, 00:16:34.815 "claimed": true, 00:16:34.815 "claim_type": "exclusive_write", 00:16:34.815 "zoned": false, 00:16:34.815 "supported_io_types": { 00:16:34.815 "read": true, 00:16:34.815 "write": true, 00:16:34.815 "unmap": true, 00:16:34.815 "flush": true, 00:16:34.815 "reset": true, 00:16:34.815 "nvme_admin": false, 00:16:34.815 "nvme_io": false, 00:16:34.815 "nvme_io_md": false, 00:16:34.815 "write_zeroes": true, 00:16:34.815 "zcopy": true, 00:16:34.815 "get_zone_info": false, 00:16:34.815 "zone_management": false, 00:16:34.815 "zone_append": false, 00:16:34.815 "compare": false, 00:16:34.815 "compare_and_write": false, 00:16:34.815 "abort": true, 00:16:34.815 "seek_hole": false, 00:16:34.815 "seek_data": false, 00:16:34.815 "copy": true, 00:16:34.815 "nvme_iov_md": false 00:16:34.815 }, 00:16:34.815 "memory_domains": [ 00:16:34.815 { 00:16:34.815 "dma_device_id": "system", 00:16:34.815 "dma_device_type": 1 00:16:34.815 }, 
00:16:34.815 { 00:16:34.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.815 "dma_device_type": 2 00:16:34.815 } 00:16:34.815 ], 00:16:34.815 "driver_specific": {} 00:16:34.815 } 00:16:34.815 ] 00:16:34.815 18:20:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "Existed_Raid")' 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.816 "name": "Existed_Raid", 00:16:34.816 "uuid": "39bc0c0a-0821-491f-b084-2cfa6db9007f", 00:16:34.816 "strip_size_kb": 0, 00:16:34.816 "state": "online", 00:16:34.816 "raid_level": "raid1", 00:16:34.816 "superblock": true, 00:16:34.816 "num_base_bdevs": 3, 00:16:34.816 "num_base_bdevs_discovered": 3, 00:16:34.816 "num_base_bdevs_operational": 3, 00:16:34.816 "base_bdevs_list": [ 00:16:34.816 { 00:16:34.816 "name": "BaseBdev1", 00:16:34.816 "uuid": "3042ebb7-b5e1-4065-8ab5-682836373b8a", 00:16:34.816 "is_configured": true, 00:16:34.816 "data_offset": 2048, 00:16:34.816 "data_size": 63488 00:16:34.816 }, 00:16:34.816 { 00:16:34.816 "name": "BaseBdev2", 00:16:34.816 "uuid": "cc6f28bf-3afb-4aa9-b8ed-f84481026c99", 00:16:34.816 "is_configured": true, 00:16:34.816 "data_offset": 2048, 00:16:34.816 "data_size": 63488 00:16:34.816 }, 00:16:34.816 { 00:16:34.816 "name": "BaseBdev3", 00:16:34.816 "uuid": "5280a71e-5767-4063-9f38-d596ba81619e", 00:16:34.816 "is_configured": true, 00:16:34.816 "data_offset": 2048, 00:16:34.816 "data_size": 63488 00:16:34.816 } 00:16:34.816 ] 00:16:34.816 }' 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.816 18:20:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:35.383 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:35.383 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:35.383 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:35.383 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:35.383 18:20:19 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:35.383 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:35.383 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:35.383 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:35.642 [2024-07-12 18:20:19.173436] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:35.642 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:35.642 "name": "Existed_Raid", 00:16:35.642 "aliases": [ 00:16:35.642 "39bc0c0a-0821-491f-b084-2cfa6db9007f" 00:16:35.642 ], 00:16:35.642 "product_name": "Raid Volume", 00:16:35.642 "block_size": 512, 00:16:35.642 "num_blocks": 63488, 00:16:35.642 "uuid": "39bc0c0a-0821-491f-b084-2cfa6db9007f", 00:16:35.642 "assigned_rate_limits": { 00:16:35.642 "rw_ios_per_sec": 0, 00:16:35.642 "rw_mbytes_per_sec": 0, 00:16:35.642 "r_mbytes_per_sec": 0, 00:16:35.642 "w_mbytes_per_sec": 0 00:16:35.642 }, 00:16:35.642 "claimed": false, 00:16:35.642 "zoned": false, 00:16:35.642 "supported_io_types": { 00:16:35.642 "read": true, 00:16:35.642 "write": true, 00:16:35.642 "unmap": false, 00:16:35.642 "flush": false, 00:16:35.642 "reset": true, 00:16:35.642 "nvme_admin": false, 00:16:35.642 "nvme_io": false, 00:16:35.642 "nvme_io_md": false, 00:16:35.642 "write_zeroes": true, 00:16:35.642 "zcopy": false, 00:16:35.642 "get_zone_info": false, 00:16:35.642 "zone_management": false, 00:16:35.642 "zone_append": false, 00:16:35.642 "compare": false, 00:16:35.642 "compare_and_write": false, 00:16:35.642 "abort": false, 00:16:35.642 "seek_hole": false, 00:16:35.642 "seek_data": false, 00:16:35.642 "copy": false, 00:16:35.642 "nvme_iov_md": false 00:16:35.642 }, 00:16:35.642 "memory_domains": [ 00:16:35.642 { 
00:16:35.642 "dma_device_id": "system", 00:16:35.642 "dma_device_type": 1 00:16:35.642 }, 00:16:35.642 { 00:16:35.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.642 "dma_device_type": 2 00:16:35.642 }, 00:16:35.642 { 00:16:35.642 "dma_device_id": "system", 00:16:35.642 "dma_device_type": 1 00:16:35.642 }, 00:16:35.642 { 00:16:35.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.642 "dma_device_type": 2 00:16:35.642 }, 00:16:35.642 { 00:16:35.642 "dma_device_id": "system", 00:16:35.642 "dma_device_type": 1 00:16:35.642 }, 00:16:35.642 { 00:16:35.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.642 "dma_device_type": 2 00:16:35.642 } 00:16:35.642 ], 00:16:35.642 "driver_specific": { 00:16:35.642 "raid": { 00:16:35.642 "uuid": "39bc0c0a-0821-491f-b084-2cfa6db9007f", 00:16:35.642 "strip_size_kb": 0, 00:16:35.642 "state": "online", 00:16:35.642 "raid_level": "raid1", 00:16:35.642 "superblock": true, 00:16:35.642 "num_base_bdevs": 3, 00:16:35.642 "num_base_bdevs_discovered": 3, 00:16:35.642 "num_base_bdevs_operational": 3, 00:16:35.642 "base_bdevs_list": [ 00:16:35.642 { 00:16:35.642 "name": "BaseBdev1", 00:16:35.642 "uuid": "3042ebb7-b5e1-4065-8ab5-682836373b8a", 00:16:35.642 "is_configured": true, 00:16:35.642 "data_offset": 2048, 00:16:35.642 "data_size": 63488 00:16:35.642 }, 00:16:35.642 { 00:16:35.642 "name": "BaseBdev2", 00:16:35.642 "uuid": "cc6f28bf-3afb-4aa9-b8ed-f84481026c99", 00:16:35.642 "is_configured": true, 00:16:35.642 "data_offset": 2048, 00:16:35.642 "data_size": 63488 00:16:35.642 }, 00:16:35.642 { 00:16:35.642 "name": "BaseBdev3", 00:16:35.642 "uuid": "5280a71e-5767-4063-9f38-d596ba81619e", 00:16:35.642 "is_configured": true, 00:16:35.642 "data_offset": 2048, 00:16:35.642 "data_size": 63488 00:16:35.642 } 00:16:35.642 ] 00:16:35.642 } 00:16:35.642 } 00:16:35.642 }' 00:16:35.642 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == 
true).name' 00:16:35.642 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:35.642 BaseBdev2 00:16:35.642 BaseBdev3' 00:16:35.642 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:35.642 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:35.642 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:35.901 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:35.901 "name": "BaseBdev1", 00:16:35.901 "aliases": [ 00:16:35.901 "3042ebb7-b5e1-4065-8ab5-682836373b8a" 00:16:35.901 ], 00:16:35.901 "product_name": "Malloc disk", 00:16:35.901 "block_size": 512, 00:16:35.901 "num_blocks": 65536, 00:16:35.901 "uuid": "3042ebb7-b5e1-4065-8ab5-682836373b8a", 00:16:35.901 "assigned_rate_limits": { 00:16:35.901 "rw_ios_per_sec": 0, 00:16:35.901 "rw_mbytes_per_sec": 0, 00:16:35.901 "r_mbytes_per_sec": 0, 00:16:35.901 "w_mbytes_per_sec": 0 00:16:35.901 }, 00:16:35.901 "claimed": true, 00:16:35.901 "claim_type": "exclusive_write", 00:16:35.901 "zoned": false, 00:16:35.901 "supported_io_types": { 00:16:35.901 "read": true, 00:16:35.901 "write": true, 00:16:35.901 "unmap": true, 00:16:35.901 "flush": true, 00:16:35.901 "reset": true, 00:16:35.901 "nvme_admin": false, 00:16:35.901 "nvme_io": false, 00:16:35.901 "nvme_io_md": false, 00:16:35.901 "write_zeroes": true, 00:16:35.901 "zcopy": true, 00:16:35.901 "get_zone_info": false, 00:16:35.901 "zone_management": false, 00:16:35.901 "zone_append": false, 00:16:35.901 "compare": false, 00:16:35.901 "compare_and_write": false, 00:16:35.901 "abort": true, 00:16:35.901 "seek_hole": false, 00:16:35.901 "seek_data": false, 00:16:35.901 "copy": true, 00:16:35.901 "nvme_iov_md": false 00:16:35.901 
}, 00:16:35.901 "memory_domains": [ 00:16:35.901 { 00:16:35.901 "dma_device_id": "system", 00:16:35.901 "dma_device_type": 1 00:16:35.901 }, 00:16:35.901 { 00:16:35.901 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.901 "dma_device_type": 2 00:16:35.901 } 00:16:35.901 ], 00:16:35.901 "driver_specific": {} 00:16:35.901 }' 00:16:35.901 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.901 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.901 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:35.901 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.901 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:36.160 18:20:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.419 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.419 "name": "BaseBdev2", 00:16:36.419 "aliases": [ 00:16:36.419 "cc6f28bf-3afb-4aa9-b8ed-f84481026c99" 00:16:36.419 ], 00:16:36.419 "product_name": "Malloc disk", 00:16:36.419 "block_size": 512, 00:16:36.419 "num_blocks": 65536, 00:16:36.419 "uuid": "cc6f28bf-3afb-4aa9-b8ed-f84481026c99", 00:16:36.419 "assigned_rate_limits": { 00:16:36.419 "rw_ios_per_sec": 0, 00:16:36.419 "rw_mbytes_per_sec": 0, 00:16:36.419 "r_mbytes_per_sec": 0, 00:16:36.419 "w_mbytes_per_sec": 0 00:16:36.419 }, 00:16:36.419 "claimed": true, 00:16:36.419 "claim_type": "exclusive_write", 00:16:36.419 "zoned": false, 00:16:36.419 "supported_io_types": { 00:16:36.419 "read": true, 00:16:36.419 "write": true, 00:16:36.419 "unmap": true, 00:16:36.419 "flush": true, 00:16:36.419 "reset": true, 00:16:36.419 "nvme_admin": false, 00:16:36.419 "nvme_io": false, 00:16:36.419 "nvme_io_md": false, 00:16:36.419 "write_zeroes": true, 00:16:36.419 "zcopy": true, 00:16:36.419 "get_zone_info": false, 00:16:36.419 "zone_management": false, 00:16:36.419 "zone_append": false, 00:16:36.419 "compare": false, 00:16:36.419 "compare_and_write": false, 00:16:36.419 "abort": true, 00:16:36.419 "seek_hole": false, 00:16:36.419 "seek_data": false, 00:16:36.419 "copy": true, 00:16:36.419 "nvme_iov_md": false 00:16:36.419 }, 00:16:36.419 "memory_domains": [ 00:16:36.419 { 00:16:36.419 "dma_device_id": "system", 00:16:36.419 "dma_device_type": 1 00:16:36.419 }, 00:16:36.419 { 00:16:36.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.419 "dma_device_type": 2 00:16:36.419 } 00:16:36.419 ], 00:16:36.419 "driver_specific": {} 00:16:36.419 }' 00:16:36.419 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.419 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.419 18:20:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:36.419 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.678 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.678 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:36.678 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.678 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.678 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:36.678 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.678 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.936 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:36.936 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:36.936 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:36.936 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.936 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.936 "name": "BaseBdev3", 00:16:36.937 "aliases": [ 00:16:36.937 "5280a71e-5767-4063-9f38-d596ba81619e" 00:16:36.937 ], 00:16:36.937 "product_name": "Malloc disk", 00:16:36.937 "block_size": 512, 00:16:36.937 "num_blocks": 65536, 00:16:36.937 "uuid": "5280a71e-5767-4063-9f38-d596ba81619e", 00:16:36.937 "assigned_rate_limits": { 00:16:36.937 "rw_ios_per_sec": 0, 00:16:36.937 "rw_mbytes_per_sec": 0, 00:16:36.937 
"r_mbytes_per_sec": 0, 00:16:36.937 "w_mbytes_per_sec": 0 00:16:36.937 }, 00:16:36.937 "claimed": true, 00:16:36.937 "claim_type": "exclusive_write", 00:16:36.937 "zoned": false, 00:16:36.937 "supported_io_types": { 00:16:36.937 "read": true, 00:16:36.937 "write": true, 00:16:36.937 "unmap": true, 00:16:36.937 "flush": true, 00:16:36.937 "reset": true, 00:16:36.937 "nvme_admin": false, 00:16:36.937 "nvme_io": false, 00:16:36.937 "nvme_io_md": false, 00:16:36.937 "write_zeroes": true, 00:16:36.937 "zcopy": true, 00:16:36.937 "get_zone_info": false, 00:16:36.937 "zone_management": false, 00:16:36.937 "zone_append": false, 00:16:36.937 "compare": false, 00:16:36.937 "compare_and_write": false, 00:16:36.937 "abort": true, 00:16:36.937 "seek_hole": false, 00:16:36.937 "seek_data": false, 00:16:36.937 "copy": true, 00:16:36.937 "nvme_iov_md": false 00:16:36.937 }, 00:16:36.937 "memory_domains": [ 00:16:36.937 { 00:16:36.937 "dma_device_id": "system", 00:16:36.937 "dma_device_type": 1 00:16:36.937 }, 00:16:36.937 { 00:16:36.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.937 "dma_device_type": 2 00:16:36.937 } 00:16:36.937 ], 00:16:36.937 "driver_specific": {} 00:16:36.937 }' 00:16:36.937 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.195 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.195 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.195 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.195 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.195 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.195 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.195 18:20:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.453 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.453 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.453 18:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.453 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:37.453 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:37.713 [2024-07-12 18:20:21.262731] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.713 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:38.280 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:38.280 "name": "Existed_Raid", 00:16:38.280 "uuid": "39bc0c0a-0821-491f-b084-2cfa6db9007f", 00:16:38.280 "strip_size_kb": 0, 00:16:38.280 "state": "online", 00:16:38.280 "raid_level": "raid1", 00:16:38.280 "superblock": true, 00:16:38.280 "num_base_bdevs": 3, 00:16:38.280 "num_base_bdevs_discovered": 2, 00:16:38.280 "num_base_bdevs_operational": 2, 00:16:38.280 "base_bdevs_list": [ 00:16:38.280 { 00:16:38.280 "name": null, 00:16:38.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:38.280 "is_configured": false, 00:16:38.280 "data_offset": 2048, 00:16:38.280 "data_size": 63488 00:16:38.280 }, 00:16:38.280 { 00:16:38.280 "name": "BaseBdev2", 00:16:38.280 "uuid": "cc6f28bf-3afb-4aa9-b8ed-f84481026c99", 00:16:38.281 "is_configured": true, 00:16:38.281 "data_offset": 2048, 00:16:38.281 "data_size": 63488 00:16:38.281 }, 00:16:38.281 { 00:16:38.281 "name": "BaseBdev3", 00:16:38.281 "uuid": "5280a71e-5767-4063-9f38-d596ba81619e", 00:16:38.281 "is_configured": true, 00:16:38.281 "data_offset": 2048, 00:16:38.281 
"data_size": 63488 00:16:38.281 } 00:16:38.281 ] 00:16:38.281 }' 00:16:38.281 18:20:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:38.281 18:20:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:38.849 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:38.849 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:38.849 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:38.849 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.107 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:39.107 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:39.107 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:39.366 [2024-07-12 18:20:22.868906] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:39.366 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:39.366 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:39.366 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.366 18:20:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:39.625 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # 
raid_bdev=Existed_Raid 00:16:39.625 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:39.625 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:39.883 [2024-07-12 18:20:23.369563] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:39.883 [2024-07-12 18:20:23.369650] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:39.883 [2024-07-12 18:20:23.382124] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:39.883 [2024-07-12 18:20:23.382160] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:39.883 [2024-07-12 18:20:23.382172] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbea400 name Existed_Raid, state offline 00:16:39.883 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:39.883 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:39.883 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.883 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:40.142 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:40.142 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:40.142 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:40.142 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:40.142 18:20:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:40.142 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:40.142 BaseBdev2 00:16:40.402 18:20:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:40.402 18:20:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:40.402 18:20:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:40.402 18:20:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:40.402 18:20:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:40.402 18:20:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:40.402 18:20:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:40.402 18:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:40.662 [ 00:16:40.662 { 00:16:40.662 "name": "BaseBdev2", 00:16:40.662 "aliases": [ 00:16:40.662 "3fe9a57f-5071-40bc-a7a0-dbb4f541c626" 00:16:40.662 ], 00:16:40.662 "product_name": "Malloc disk", 00:16:40.662 "block_size": 512, 00:16:40.662 "num_blocks": 65536, 00:16:40.662 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:40.662 "assigned_rate_limits": { 00:16:40.662 "rw_ios_per_sec": 0, 00:16:40.662 "rw_mbytes_per_sec": 0, 00:16:40.662 "r_mbytes_per_sec": 0, 00:16:40.662 "w_mbytes_per_sec": 0 00:16:40.662 }, 00:16:40.662 
"claimed": false, 00:16:40.662 "zoned": false, 00:16:40.662 "supported_io_types": { 00:16:40.662 "read": true, 00:16:40.662 "write": true, 00:16:40.662 "unmap": true, 00:16:40.662 "flush": true, 00:16:40.662 "reset": true, 00:16:40.662 "nvme_admin": false, 00:16:40.662 "nvme_io": false, 00:16:40.662 "nvme_io_md": false, 00:16:40.662 "write_zeroes": true, 00:16:40.662 "zcopy": true, 00:16:40.662 "get_zone_info": false, 00:16:40.662 "zone_management": false, 00:16:40.662 "zone_append": false, 00:16:40.662 "compare": false, 00:16:40.662 "compare_and_write": false, 00:16:40.662 "abort": true, 00:16:40.662 "seek_hole": false, 00:16:40.662 "seek_data": false, 00:16:40.662 "copy": true, 00:16:40.662 "nvme_iov_md": false 00:16:40.662 }, 00:16:40.662 "memory_domains": [ 00:16:40.662 { 00:16:40.662 "dma_device_id": "system", 00:16:40.662 "dma_device_type": 1 00:16:40.662 }, 00:16:40.662 { 00:16:40.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.662 "dma_device_type": 2 00:16:40.662 } 00:16:40.662 ], 00:16:40.662 "driver_specific": {} 00:16:40.662 } 00:16:40.662 ] 00:16:40.662 18:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:40.662 18:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:40.662 18:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:40.662 18:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:40.921 BaseBdev3 00:16:40.921 18:20:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:40.921 18:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:40.921 18:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 
00:16:40.922 18:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:40.922 18:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:40.922 18:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:40.922 18:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:41.180 18:20:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:41.747 [ 00:16:41.747 { 00:16:41.747 "name": "BaseBdev3", 00:16:41.747 "aliases": [ 00:16:41.747 "59f79d38-4eb5-41fd-a44f-6765238899b2" 00:16:41.747 ], 00:16:41.747 "product_name": "Malloc disk", 00:16:41.747 "block_size": 512, 00:16:41.747 "num_blocks": 65536, 00:16:41.747 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:41.747 "assigned_rate_limits": { 00:16:41.747 "rw_ios_per_sec": 0, 00:16:41.747 "rw_mbytes_per_sec": 0, 00:16:41.747 "r_mbytes_per_sec": 0, 00:16:41.747 "w_mbytes_per_sec": 0 00:16:41.747 }, 00:16:41.747 "claimed": false, 00:16:41.747 "zoned": false, 00:16:41.747 "supported_io_types": { 00:16:41.748 "read": true, 00:16:41.748 "write": true, 00:16:41.748 "unmap": true, 00:16:41.748 "flush": true, 00:16:41.748 "reset": true, 00:16:41.748 "nvme_admin": false, 00:16:41.748 "nvme_io": false, 00:16:41.748 "nvme_io_md": false, 00:16:41.748 "write_zeroes": true, 00:16:41.748 "zcopy": true, 00:16:41.748 "get_zone_info": false, 00:16:41.748 "zone_management": false, 00:16:41.748 "zone_append": false, 00:16:41.748 "compare": false, 00:16:41.748 "compare_and_write": false, 00:16:41.748 "abort": true, 00:16:41.748 "seek_hole": false, 00:16:41.748 "seek_data": false, 00:16:41.748 "copy": true, 
00:16:41.748 "nvme_iov_md": false 00:16:41.748 }, 00:16:41.748 "memory_domains": [ 00:16:41.748 { 00:16:41.748 "dma_device_id": "system", 00:16:41.748 "dma_device_type": 1 00:16:41.748 }, 00:16:41.748 { 00:16:41.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.748 "dma_device_type": 2 00:16:41.748 } 00:16:41.748 ], 00:16:41.748 "driver_specific": {} 00:16:41.748 } 00:16:41.748 ] 00:16:41.748 18:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:41.748 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:41.748 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:41.748 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:42.006 [2024-07-12 18:20:25.564450] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:42.006 [2024-07-12 18:20:25.564491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:42.006 [2024-07-12 18:20:25.564509] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:42.006 [2024-07-12 18:20:25.565872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.006 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.265 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.265 "name": "Existed_Raid", 00:16:42.265 "uuid": "3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:42.265 "strip_size_kb": 0, 00:16:42.265 "state": "configuring", 00:16:42.265 "raid_level": "raid1", 00:16:42.265 "superblock": true, 00:16:42.265 "num_base_bdevs": 3, 00:16:42.265 "num_base_bdevs_discovered": 2, 00:16:42.265 "num_base_bdevs_operational": 3, 00:16:42.265 "base_bdevs_list": [ 00:16:42.265 { 00:16:42.265 "name": "BaseBdev1", 00:16:42.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.265 "is_configured": false, 00:16:42.265 "data_offset": 0, 00:16:42.265 "data_size": 0 00:16:42.265 }, 00:16:42.265 { 00:16:42.265 "name": "BaseBdev2", 00:16:42.265 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:42.265 "is_configured": true, 00:16:42.265 "data_offset": 2048, 00:16:42.265 "data_size": 63488 00:16:42.265 }, 00:16:42.265 { 00:16:42.265 
"name": "BaseBdev3", 00:16:42.265 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:42.265 "is_configured": true, 00:16:42.265 "data_offset": 2048, 00:16:42.265 "data_size": 63488 00:16:42.265 } 00:16:42.265 ] 00:16:42.265 }' 00:16:42.265 18:20:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.265 18:20:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:42.831 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:43.090 [2024-07-12 18:20:26.623220] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.090 18:20:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.090 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.350 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.350 "name": "Existed_Raid", 00:16:43.350 "uuid": "3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:43.350 "strip_size_kb": 0, 00:16:43.350 "state": "configuring", 00:16:43.350 "raid_level": "raid1", 00:16:43.350 "superblock": true, 00:16:43.350 "num_base_bdevs": 3, 00:16:43.350 "num_base_bdevs_discovered": 1, 00:16:43.350 "num_base_bdevs_operational": 3, 00:16:43.350 "base_bdevs_list": [ 00:16:43.350 { 00:16:43.350 "name": "BaseBdev1", 00:16:43.350 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.350 "is_configured": false, 00:16:43.350 "data_offset": 0, 00:16:43.350 "data_size": 0 00:16:43.350 }, 00:16:43.350 { 00:16:43.350 "name": null, 00:16:43.350 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:43.350 "is_configured": false, 00:16:43.350 "data_offset": 2048, 00:16:43.350 "data_size": 63488 00:16:43.350 }, 00:16:43.350 { 00:16:43.350 "name": "BaseBdev3", 00:16:43.350 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:43.350 "is_configured": true, 00:16:43.350 "data_offset": 2048, 00:16:43.350 "data_size": 63488 00:16:43.350 } 00:16:43.350 ] 00:16:43.350 }' 00:16:43.350 18:20:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.350 18:20:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.917 18:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.917 18:20:27 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:44.174 18:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:44.174 18:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:44.433 [2024-07-12 18:20:27.962134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:44.433 BaseBdev1 00:16:44.433 18:20:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:44.433 18:20:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:44.433 18:20:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:44.433 18:20:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:44.433 18:20:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:44.433 18:20:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:44.433 18:20:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:44.701 18:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:44.997 [ 00:16:44.997 { 00:16:44.997 "name": "BaseBdev1", 00:16:44.997 "aliases": [ 00:16:44.997 "0542db2a-e147-4ac2-8555-862c53cdb876" 00:16:44.997 ], 00:16:44.997 "product_name": "Malloc disk", 00:16:44.997 "block_size": 512, 00:16:44.997 "num_blocks": 65536, 00:16:44.997 "uuid": 
"0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:44.997 "assigned_rate_limits": { 00:16:44.997 "rw_ios_per_sec": 0, 00:16:44.997 "rw_mbytes_per_sec": 0, 00:16:44.997 "r_mbytes_per_sec": 0, 00:16:44.997 "w_mbytes_per_sec": 0 00:16:44.997 }, 00:16:44.997 "claimed": true, 00:16:44.997 "claim_type": "exclusive_write", 00:16:44.997 "zoned": false, 00:16:44.997 "supported_io_types": { 00:16:44.997 "read": true, 00:16:44.997 "write": true, 00:16:44.997 "unmap": true, 00:16:44.997 "flush": true, 00:16:44.997 "reset": true, 00:16:44.997 "nvme_admin": false, 00:16:44.997 "nvme_io": false, 00:16:44.997 "nvme_io_md": false, 00:16:44.997 "write_zeroes": true, 00:16:44.997 "zcopy": true, 00:16:44.997 "get_zone_info": false, 00:16:44.997 "zone_management": false, 00:16:44.997 "zone_append": false, 00:16:44.997 "compare": false, 00:16:44.997 "compare_and_write": false, 00:16:44.997 "abort": true, 00:16:44.997 "seek_hole": false, 00:16:44.997 "seek_data": false, 00:16:44.997 "copy": true, 00:16:44.997 "nvme_iov_md": false 00:16:44.997 }, 00:16:44.997 "memory_domains": [ 00:16:44.997 { 00:16:44.997 "dma_device_id": "system", 00:16:44.997 "dma_device_type": 1 00:16:44.997 }, 00:16:44.997 { 00:16:44.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.997 "dma_device_type": 2 00:16:44.997 } 00:16:44.997 ], 00:16:44.997 "driver_specific": {} 00:16:44.997 } 00:16:44.997 ] 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:44.997 
18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.997 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.255 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.255 "name": "Existed_Raid", 00:16:45.255 "uuid": "3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:45.255 "strip_size_kb": 0, 00:16:45.255 "state": "configuring", 00:16:45.255 "raid_level": "raid1", 00:16:45.255 "superblock": true, 00:16:45.255 "num_base_bdevs": 3, 00:16:45.255 "num_base_bdevs_discovered": 2, 00:16:45.255 "num_base_bdevs_operational": 3, 00:16:45.255 "base_bdevs_list": [ 00:16:45.255 { 00:16:45.255 "name": "BaseBdev1", 00:16:45.255 "uuid": "0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:45.255 "is_configured": true, 00:16:45.255 "data_offset": 2048, 00:16:45.255 "data_size": 63488 00:16:45.255 }, 00:16:45.255 { 00:16:45.255 "name": null, 00:16:45.255 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:45.255 "is_configured": false, 00:16:45.255 "data_offset": 2048, 00:16:45.255 "data_size": 63488 00:16:45.255 }, 00:16:45.255 { 00:16:45.255 "name": 
"BaseBdev3", 00:16:45.255 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:45.255 "is_configured": true, 00:16:45.255 "data_offset": 2048, 00:16:45.255 "data_size": 63488 00:16:45.255 } 00:16:45.255 ] 00:16:45.255 }' 00:16:45.255 18:20:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.255 18:20:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.838 18:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.838 18:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:46.095 18:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:46.095 18:20:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:46.354 [2024-07-12 18:20:30.019618] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.354 18:20:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.354 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.612 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.612 "name": "Existed_Raid", 00:16:46.612 "uuid": "3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:46.612 "strip_size_kb": 0, 00:16:46.612 "state": "configuring", 00:16:46.612 "raid_level": "raid1", 00:16:46.612 "superblock": true, 00:16:46.612 "num_base_bdevs": 3, 00:16:46.612 "num_base_bdevs_discovered": 1, 00:16:46.612 "num_base_bdevs_operational": 3, 00:16:46.612 "base_bdevs_list": [ 00:16:46.612 { 00:16:46.612 "name": "BaseBdev1", 00:16:46.612 "uuid": "0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:46.612 "is_configured": true, 00:16:46.612 "data_offset": 2048, 00:16:46.612 "data_size": 63488 00:16:46.612 }, 00:16:46.612 { 00:16:46.612 "name": null, 00:16:46.612 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:46.612 "is_configured": false, 00:16:46.612 "data_offset": 2048, 00:16:46.612 "data_size": 63488 00:16:46.612 }, 00:16:46.612 { 00:16:46.612 "name": null, 00:16:46.612 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:46.612 "is_configured": false, 00:16:46.612 "data_offset": 2048, 00:16:46.612 "data_size": 63488 00:16:46.612 } 00:16:46.612 ] 00:16:46.612 }' 00:16:46.612 18:20:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.612 18:20:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.179 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.179 18:20:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:47.438 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:47.438 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:47.698 [2024-07-12 18:20:31.371374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.698 18:20:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.698 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.956 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.956 "name": "Existed_Raid", 00:16:47.956 "uuid": "3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:47.956 "strip_size_kb": 0, 00:16:47.956 "state": "configuring", 00:16:47.956 "raid_level": "raid1", 00:16:47.956 "superblock": true, 00:16:47.956 "num_base_bdevs": 3, 00:16:47.956 "num_base_bdevs_discovered": 2, 00:16:47.956 "num_base_bdevs_operational": 3, 00:16:47.956 "base_bdevs_list": [ 00:16:47.956 { 00:16:47.956 "name": "BaseBdev1", 00:16:47.956 "uuid": "0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:47.956 "is_configured": true, 00:16:47.956 "data_offset": 2048, 00:16:47.956 "data_size": 63488 00:16:47.956 }, 00:16:47.956 { 00:16:47.956 "name": null, 00:16:47.956 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:47.956 "is_configured": false, 00:16:47.956 "data_offset": 2048, 00:16:47.956 "data_size": 63488 00:16:47.956 }, 00:16:47.956 { 00:16:47.956 "name": "BaseBdev3", 00:16:47.956 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:47.956 "is_configured": true, 00:16:47.956 "data_offset": 2048, 00:16:47.956 "data_size": 63488 00:16:47.956 } 00:16:47.956 ] 00:16:47.956 }' 00:16:47.956 18:20:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.956 18:20:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:48.522 18:20:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.522 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:48.780 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:48.780 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:49.037 [2024-07-12 18:20:32.642747] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.037 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.295 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.295 "name": "Existed_Raid", 00:16:49.295 "uuid": "3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:49.295 "strip_size_kb": 0, 00:16:49.295 "state": "configuring", 00:16:49.295 "raid_level": "raid1", 00:16:49.295 "superblock": true, 00:16:49.295 "num_base_bdevs": 3, 00:16:49.295 "num_base_bdevs_discovered": 1, 00:16:49.295 "num_base_bdevs_operational": 3, 00:16:49.295 "base_bdevs_list": [ 00:16:49.295 { 00:16:49.295 "name": null, 00:16:49.295 "uuid": "0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:49.295 "is_configured": false, 00:16:49.295 "data_offset": 2048, 00:16:49.295 "data_size": 63488 00:16:49.295 }, 00:16:49.295 { 00:16:49.295 "name": null, 00:16:49.295 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:49.295 "is_configured": false, 00:16:49.295 "data_offset": 2048, 00:16:49.295 "data_size": 63488 00:16:49.295 }, 00:16:49.295 { 00:16:49.295 "name": "BaseBdev3", 00:16:49.295 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:49.295 "is_configured": true, 00:16:49.295 "data_offset": 2048, 00:16:49.295 "data_size": 63488 00:16:49.295 } 00:16:49.295 ] 00:16:49.295 }' 00:16:49.295 18:20:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.295 18:20:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:49.860 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.860 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:16:50.119 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:50.119 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:50.378 [2024-07-12 18:20:33.934599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.378 18:20:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:16:50.637 18:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.637 "name": "Existed_Raid", 00:16:50.637 "uuid": "3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:50.637 "strip_size_kb": 0, 00:16:50.637 "state": "configuring", 00:16:50.637 "raid_level": "raid1", 00:16:50.637 "superblock": true, 00:16:50.637 "num_base_bdevs": 3, 00:16:50.637 "num_base_bdevs_discovered": 2, 00:16:50.637 "num_base_bdevs_operational": 3, 00:16:50.637 "base_bdevs_list": [ 00:16:50.637 { 00:16:50.637 "name": null, 00:16:50.637 "uuid": "0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:50.637 "is_configured": false, 00:16:50.637 "data_offset": 2048, 00:16:50.637 "data_size": 63488 00:16:50.637 }, 00:16:50.637 { 00:16:50.637 "name": "BaseBdev2", 00:16:50.637 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:50.637 "is_configured": true, 00:16:50.637 "data_offset": 2048, 00:16:50.637 "data_size": 63488 00:16:50.637 }, 00:16:50.637 { 00:16:50.637 "name": "BaseBdev3", 00:16:50.637 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:50.637 "is_configured": true, 00:16:50.637 "data_offset": 2048, 00:16:50.637 "data_size": 63488 00:16:50.637 } 00:16:50.637 ] 00:16:50.637 }' 00:16:50.637 18:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.637 18:20:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:51.204 18:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.204 18:20:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:51.463 18:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:51.463 18:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.463 18:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:51.722 18:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 0542db2a-e147-4ac2-8555-862c53cdb876 00:16:51.980 [2024-07-12 18:20:35.515456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:51.980 [2024-07-12 18:20:35.515606] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbe01b0 00:16:51.980 [2024-07-12 18:20:35.515618] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:51.980 [2024-07-12 18:20:35.515792] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd9c4f0 00:16:51.980 [2024-07-12 18:20:35.515908] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbe01b0 00:16:51.980 [2024-07-12 18:20:35.515918] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbe01b0 00:16:51.980 [2024-07-12 18:20:35.516022] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:51.980 NewBaseBdev 00:16:51.980 18:20:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:51.980 18:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:51.980 18:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:51.980 18:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:51.980 18:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:51.980 
18:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:51.980 18:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:52.238 18:20:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:52.497 [ 00:16:52.497 { 00:16:52.497 "name": "NewBaseBdev", 00:16:52.497 "aliases": [ 00:16:52.497 "0542db2a-e147-4ac2-8555-862c53cdb876" 00:16:52.497 ], 00:16:52.497 "product_name": "Malloc disk", 00:16:52.497 "block_size": 512, 00:16:52.497 "num_blocks": 65536, 00:16:52.497 "uuid": "0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:52.497 "assigned_rate_limits": { 00:16:52.497 "rw_ios_per_sec": 0, 00:16:52.497 "rw_mbytes_per_sec": 0, 00:16:52.497 "r_mbytes_per_sec": 0, 00:16:52.497 "w_mbytes_per_sec": 0 00:16:52.497 }, 00:16:52.497 "claimed": true, 00:16:52.497 "claim_type": "exclusive_write", 00:16:52.497 "zoned": false, 00:16:52.497 "supported_io_types": { 00:16:52.497 "read": true, 00:16:52.497 "write": true, 00:16:52.497 "unmap": true, 00:16:52.497 "flush": true, 00:16:52.497 "reset": true, 00:16:52.497 "nvme_admin": false, 00:16:52.497 "nvme_io": false, 00:16:52.497 "nvme_io_md": false, 00:16:52.497 "write_zeroes": true, 00:16:52.497 "zcopy": true, 00:16:52.497 "get_zone_info": false, 00:16:52.497 "zone_management": false, 00:16:52.497 "zone_append": false, 00:16:52.497 "compare": false, 00:16:52.497 "compare_and_write": false, 00:16:52.497 "abort": true, 00:16:52.497 "seek_hole": false, 00:16:52.497 "seek_data": false, 00:16:52.497 "copy": true, 00:16:52.497 "nvme_iov_md": false 00:16:52.497 }, 00:16:52.497 "memory_domains": [ 00:16:52.497 { 00:16:52.497 "dma_device_id": "system", 00:16:52.497 "dma_device_type": 1 00:16:52.497 
}, 00:16:52.497 { 00:16:52.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.497 "dma_device_type": 2 00:16:52.497 } 00:16:52.497 ], 00:16:52.497 "driver_specific": {} 00:16:52.497 } 00:16:52.497 ] 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.497 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.756 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.756 "name": "Existed_Raid", 00:16:52.756 "uuid": 
"3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:52.756 "strip_size_kb": 0, 00:16:52.756 "state": "online", 00:16:52.756 "raid_level": "raid1", 00:16:52.756 "superblock": true, 00:16:52.756 "num_base_bdevs": 3, 00:16:52.756 "num_base_bdevs_discovered": 3, 00:16:52.756 "num_base_bdevs_operational": 3, 00:16:52.756 "base_bdevs_list": [ 00:16:52.756 { 00:16:52.756 "name": "NewBaseBdev", 00:16:52.756 "uuid": "0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:52.756 "is_configured": true, 00:16:52.756 "data_offset": 2048, 00:16:52.756 "data_size": 63488 00:16:52.756 }, 00:16:52.756 { 00:16:52.756 "name": "BaseBdev2", 00:16:52.756 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:52.756 "is_configured": true, 00:16:52.756 "data_offset": 2048, 00:16:52.756 "data_size": 63488 00:16:52.756 }, 00:16:52.756 { 00:16:52.756 "name": "BaseBdev3", 00:16:52.756 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:52.756 "is_configured": true, 00:16:52.756 "data_offset": 2048, 00:16:52.756 "data_size": 63488 00:16:52.756 } 00:16:52.756 ] 00:16:52.756 }' 00:16:52.756 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.756 18:20:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:53.324 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:53.324 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:53.324 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:53.324 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:53.324 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:53.324 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:53.324 18:20:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:53.324 18:20:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:53.583 [2024-07-12 18:20:37.059857] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:53.583 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:53.583 "name": "Existed_Raid", 00:16:53.584 "aliases": [ 00:16:53.584 "3f7a658b-7d8b-4a03-9efd-301313c2ffec" 00:16:53.584 ], 00:16:53.584 "product_name": "Raid Volume", 00:16:53.584 "block_size": 512, 00:16:53.584 "num_blocks": 63488, 00:16:53.584 "uuid": "3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:53.584 "assigned_rate_limits": { 00:16:53.584 "rw_ios_per_sec": 0, 00:16:53.584 "rw_mbytes_per_sec": 0, 00:16:53.584 "r_mbytes_per_sec": 0, 00:16:53.584 "w_mbytes_per_sec": 0 00:16:53.584 }, 00:16:53.584 "claimed": false, 00:16:53.584 "zoned": false, 00:16:53.584 "supported_io_types": { 00:16:53.584 "read": true, 00:16:53.584 "write": true, 00:16:53.584 "unmap": false, 00:16:53.584 "flush": false, 00:16:53.584 "reset": true, 00:16:53.584 "nvme_admin": false, 00:16:53.584 "nvme_io": false, 00:16:53.584 "nvme_io_md": false, 00:16:53.584 "write_zeroes": true, 00:16:53.584 "zcopy": false, 00:16:53.584 "get_zone_info": false, 00:16:53.584 "zone_management": false, 00:16:53.584 "zone_append": false, 00:16:53.584 "compare": false, 00:16:53.584 "compare_and_write": false, 00:16:53.584 "abort": false, 00:16:53.584 "seek_hole": false, 00:16:53.584 "seek_data": false, 00:16:53.584 "copy": false, 00:16:53.584 "nvme_iov_md": false 00:16:53.584 }, 00:16:53.584 "memory_domains": [ 00:16:53.584 { 00:16:53.584 "dma_device_id": "system", 00:16:53.584 "dma_device_type": 1 00:16:53.584 }, 00:16:53.584 { 00:16:53.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.584 
"dma_device_type": 2 00:16:53.584 }, 00:16:53.584 { 00:16:53.584 "dma_device_id": "system", 00:16:53.584 "dma_device_type": 1 00:16:53.584 }, 00:16:53.584 { 00:16:53.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.584 "dma_device_type": 2 00:16:53.584 }, 00:16:53.584 { 00:16:53.584 "dma_device_id": "system", 00:16:53.584 "dma_device_type": 1 00:16:53.584 }, 00:16:53.584 { 00:16:53.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.584 "dma_device_type": 2 00:16:53.584 } 00:16:53.584 ], 00:16:53.584 "driver_specific": { 00:16:53.584 "raid": { 00:16:53.584 "uuid": "3f7a658b-7d8b-4a03-9efd-301313c2ffec", 00:16:53.584 "strip_size_kb": 0, 00:16:53.584 "state": "online", 00:16:53.584 "raid_level": "raid1", 00:16:53.584 "superblock": true, 00:16:53.584 "num_base_bdevs": 3, 00:16:53.584 "num_base_bdevs_discovered": 3, 00:16:53.584 "num_base_bdevs_operational": 3, 00:16:53.584 "base_bdevs_list": [ 00:16:53.584 { 00:16:53.584 "name": "NewBaseBdev", 00:16:53.584 "uuid": "0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:53.584 "is_configured": true, 00:16:53.584 "data_offset": 2048, 00:16:53.584 "data_size": 63488 00:16:53.584 }, 00:16:53.584 { 00:16:53.584 "name": "BaseBdev2", 00:16:53.584 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:53.584 "is_configured": true, 00:16:53.584 "data_offset": 2048, 00:16:53.584 "data_size": 63488 00:16:53.584 }, 00:16:53.584 { 00:16:53.584 "name": "BaseBdev3", 00:16:53.584 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:53.584 "is_configured": true, 00:16:53.584 "data_offset": 2048, 00:16:53.584 "data_size": 63488 00:16:53.584 } 00:16:53.584 ] 00:16:53.584 } 00:16:53.584 } 00:16:53.584 }' 00:16:53.584 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:53.584 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:53.584 BaseBdev2 00:16:53.584 
BaseBdev3' 00:16:53.584 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.584 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:53.584 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.843 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.843 "name": "NewBaseBdev", 00:16:53.843 "aliases": [ 00:16:53.843 "0542db2a-e147-4ac2-8555-862c53cdb876" 00:16:53.843 ], 00:16:53.843 "product_name": "Malloc disk", 00:16:53.843 "block_size": 512, 00:16:53.843 "num_blocks": 65536, 00:16:53.843 "uuid": "0542db2a-e147-4ac2-8555-862c53cdb876", 00:16:53.843 "assigned_rate_limits": { 00:16:53.843 "rw_ios_per_sec": 0, 00:16:53.843 "rw_mbytes_per_sec": 0, 00:16:53.843 "r_mbytes_per_sec": 0, 00:16:53.843 "w_mbytes_per_sec": 0 00:16:53.843 }, 00:16:53.843 "claimed": true, 00:16:53.843 "claim_type": "exclusive_write", 00:16:53.843 "zoned": false, 00:16:53.843 "supported_io_types": { 00:16:53.843 "read": true, 00:16:53.843 "write": true, 00:16:53.843 "unmap": true, 00:16:53.843 "flush": true, 00:16:53.843 "reset": true, 00:16:53.843 "nvme_admin": false, 00:16:53.843 "nvme_io": false, 00:16:53.843 "nvme_io_md": false, 00:16:53.843 "write_zeroes": true, 00:16:53.843 "zcopy": true, 00:16:53.843 "get_zone_info": false, 00:16:53.843 "zone_management": false, 00:16:53.843 "zone_append": false, 00:16:53.843 "compare": false, 00:16:53.843 "compare_and_write": false, 00:16:53.843 "abort": true, 00:16:53.843 "seek_hole": false, 00:16:53.843 "seek_data": false, 00:16:53.843 "copy": true, 00:16:53.843 "nvme_iov_md": false 00:16:53.843 }, 00:16:53.843 "memory_domains": [ 00:16:53.843 { 00:16:53.843 "dma_device_id": "system", 00:16:53.843 "dma_device_type": 1 00:16:53.843 }, 00:16:53.843 { 
00:16:53.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.843 "dma_device_type": 2 00:16:53.843 } 00:16:53.843 ], 00:16:53.843 "driver_specific": {} 00:16:53.843 }' 00:16:53.843 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.843 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.843 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.843 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.843 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.843 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.843 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.101 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.101 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.101 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.102 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.102 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.102 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.102 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:54.102 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.359 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.359 "name": 
"BaseBdev2", 00:16:54.359 "aliases": [ 00:16:54.359 "3fe9a57f-5071-40bc-a7a0-dbb4f541c626" 00:16:54.359 ], 00:16:54.359 "product_name": "Malloc disk", 00:16:54.359 "block_size": 512, 00:16:54.359 "num_blocks": 65536, 00:16:54.359 "uuid": "3fe9a57f-5071-40bc-a7a0-dbb4f541c626", 00:16:54.359 "assigned_rate_limits": { 00:16:54.359 "rw_ios_per_sec": 0, 00:16:54.359 "rw_mbytes_per_sec": 0, 00:16:54.359 "r_mbytes_per_sec": 0, 00:16:54.359 "w_mbytes_per_sec": 0 00:16:54.359 }, 00:16:54.359 "claimed": true, 00:16:54.359 "claim_type": "exclusive_write", 00:16:54.359 "zoned": false, 00:16:54.359 "supported_io_types": { 00:16:54.359 "read": true, 00:16:54.359 "write": true, 00:16:54.359 "unmap": true, 00:16:54.359 "flush": true, 00:16:54.359 "reset": true, 00:16:54.359 "nvme_admin": false, 00:16:54.359 "nvme_io": false, 00:16:54.359 "nvme_io_md": false, 00:16:54.359 "write_zeroes": true, 00:16:54.359 "zcopy": true, 00:16:54.359 "get_zone_info": false, 00:16:54.359 "zone_management": false, 00:16:54.359 "zone_append": false, 00:16:54.359 "compare": false, 00:16:54.359 "compare_and_write": false, 00:16:54.359 "abort": true, 00:16:54.359 "seek_hole": false, 00:16:54.359 "seek_data": false, 00:16:54.359 "copy": true, 00:16:54.359 "nvme_iov_md": false 00:16:54.359 }, 00:16:54.359 "memory_domains": [ 00:16:54.359 { 00:16:54.359 "dma_device_id": "system", 00:16:54.359 "dma_device_type": 1 00:16:54.359 }, 00:16:54.359 { 00:16:54.359 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.359 "dma_device_type": 2 00:16:54.359 } 00:16:54.359 ], 00:16:54.359 "driver_specific": {} 00:16:54.359 }' 00:16:54.359 18:20:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.359 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:54.359 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:54.359 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:54.617 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:54.876 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:54.876 "name": "BaseBdev3", 00:16:54.876 "aliases": [ 00:16:54.876 "59f79d38-4eb5-41fd-a44f-6765238899b2" 00:16:54.876 ], 00:16:54.876 "product_name": "Malloc disk", 00:16:54.876 "block_size": 512, 00:16:54.876 "num_blocks": 65536, 00:16:54.876 "uuid": "59f79d38-4eb5-41fd-a44f-6765238899b2", 00:16:54.876 "assigned_rate_limits": { 00:16:54.876 "rw_ios_per_sec": 0, 00:16:54.876 "rw_mbytes_per_sec": 0, 00:16:54.876 "r_mbytes_per_sec": 0, 00:16:54.876 "w_mbytes_per_sec": 0 00:16:54.876 }, 00:16:54.876 "claimed": true, 00:16:54.876 "claim_type": "exclusive_write", 00:16:54.876 "zoned": 
false, 00:16:54.876 "supported_io_types": { 00:16:54.876 "read": true, 00:16:54.876 "write": true, 00:16:54.876 "unmap": true, 00:16:54.876 "flush": true, 00:16:54.876 "reset": true, 00:16:54.876 "nvme_admin": false, 00:16:54.876 "nvme_io": false, 00:16:54.876 "nvme_io_md": false, 00:16:54.876 "write_zeroes": true, 00:16:54.876 "zcopy": true, 00:16:54.876 "get_zone_info": false, 00:16:54.876 "zone_management": false, 00:16:54.876 "zone_append": false, 00:16:54.876 "compare": false, 00:16:54.876 "compare_and_write": false, 00:16:54.876 "abort": true, 00:16:54.876 "seek_hole": false, 00:16:54.876 "seek_data": false, 00:16:54.876 "copy": true, 00:16:54.876 "nvme_iov_md": false 00:16:54.876 }, 00:16:54.876 "memory_domains": [ 00:16:54.876 { 00:16:54.876 "dma_device_id": "system", 00:16:54.876 "dma_device_type": 1 00:16:54.876 }, 00:16:54.876 { 00:16:54.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.876 "dma_device_type": 2 00:16:54.876 } 00:16:54.876 ], 00:16:54.876 "driver_specific": {} 00:16:54.876 }' 00:16:54.876 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.135 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.135 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.135 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.135 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.135 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.135 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.135 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.135 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.135 18:20:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.394 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.394 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.394 18:20:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:55.652 [2024-07-12 18:20:39.137084] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:55.652 [2024-07-12 18:20:39.137111] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:55.652 [2024-07-12 18:20:39.137162] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:55.652 [2024-07-12 18:20:39.137434] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:55.652 [2024-07-12 18:20:39.137446] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe01b0 name Existed_Raid, state offline 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2508585 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2508585 ']' 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2508585 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2508585 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2508585' 00:16:55.652 killing process with pid 2508585 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2508585 00:16:55.652 [2024-07-12 18:20:39.203523] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:55.652 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2508585 00:16:55.652 [2024-07-12 18:20:39.233993] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:55.911 18:20:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:55.911 00:16:55.911 real 0m29.442s 00:16:55.911 user 0m54.093s 00:16:55.911 sys 0m5.174s 00:16:55.911 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:55.911 18:20:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:55.911 ************************************ 00:16:55.911 END TEST raid_state_function_test_sb 00:16:55.911 ************************************ 00:16:55.911 18:20:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:55.911 18:20:39 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:16:55.911 18:20:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:55.911 18:20:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:55.911 18:20:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:55.911 ************************************ 00:16:55.911 START TEST raid_superblock_test 00:16:55.911 ************************************ 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2513038 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2513038 /var/tmp/spdk-raid.sock 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2513038 ']' 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:55.911 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:55.911 18:20:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:55.911 [2024-07-12 18:20:39.610678] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:16:55.911 [2024-07-12 18:20:39.610744] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2513038 ] 00:16:56.170 [2024-07-12 18:20:39.741313] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:56.170 [2024-07-12 18:20:39.846761] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:56.429 [2024-07-12 18:20:39.918713] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.429 [2024-07-12 18:20:39.918752] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:56.995 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:57.254 malloc1 00:16:57.254 18:20:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:57.513 [2024-07-12 18:20:41.022102] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:57.513 [2024-07-12 18:20:41.022149] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:57.513 [2024-07-12 18:20:41.022171] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d9570 00:16:57.513 [2024-07-12 18:20:41.022184] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:57.513 [2024-07-12 18:20:41.023913] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:57.513 [2024-07-12 18:20:41.023954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:57.513 pt1 00:16:57.513 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:57.513 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:57.513 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:57.513 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:57.513 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:57.513 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:57.513 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:57.513 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:57.513 18:20:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:57.771 malloc2 00:16:57.771 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:58.029 [2024-07-12 18:20:41.517387] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:58.029 [2024-07-12 18:20:41.517434] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.029 [2024-07-12 18:20:41.517455] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23da970 00:16:58.029 [2024-07-12 18:20:41.517468] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.029 [2024-07-12 18:20:41.519138] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.029 [2024-07-12 18:20:41.519166] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:58.029 pt2 00:16:58.029 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:58.029 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:58.029 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:58.029 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:58.029 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:58.029 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:58.029 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:58.029 18:20:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:58.029 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:58.287 malloc3 00:16:58.288 18:20:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:58.288 [2024-07-12 18:20:42.004130] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:58.288 [2024-07-12 18:20:42.004172] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.288 [2024-07-12 18:20:42.004189] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2571340 00:16:58.288 [2024-07-12 18:20:42.004201] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.288 [2024-07-12 18:20:42.005698] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.288 [2024-07-12 18:20:42.005725] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:58.288 pt3 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:58.546 [2024-07-12 18:20:42.248796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:58.546 [2024-07-12 18:20:42.250123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:16:58.546 [2024-07-12 18:20:42.250177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:58.546 [2024-07-12 18:20:42.250325] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23d1ea0 00:16:58.546 [2024-07-12 18:20:42.250336] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:58.546 [2024-07-12 18:20:42.250537] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23d9240 00:16:58.546 [2024-07-12 18:20:42.250683] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23d1ea0 00:16:58.546 [2024-07-12 18:20:42.250693] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23d1ea0 00:16:58.546 [2024-07-12 18:20:42.250790] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.546 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.809 18:20:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.809 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:58.809 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.809 "name": "raid_bdev1", 00:16:58.809 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:16:58.809 "strip_size_kb": 0, 00:16:58.809 "state": "online", 00:16:58.809 "raid_level": "raid1", 00:16:58.809 "superblock": true, 00:16:58.809 "num_base_bdevs": 3, 00:16:58.809 "num_base_bdevs_discovered": 3, 00:16:58.809 "num_base_bdevs_operational": 3, 00:16:58.809 "base_bdevs_list": [ 00:16:58.809 { 00:16:58.809 "name": "pt1", 00:16:58.809 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:58.809 "is_configured": true, 00:16:58.809 "data_offset": 2048, 00:16:58.809 "data_size": 63488 00:16:58.809 }, 00:16:58.809 { 00:16:58.809 "name": "pt2", 00:16:58.809 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:58.809 "is_configured": true, 00:16:58.809 "data_offset": 2048, 00:16:58.809 "data_size": 63488 00:16:58.809 }, 00:16:58.809 { 00:16:58.809 "name": "pt3", 00:16:58.809 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:58.809 "is_configured": true, 00:16:58.809 "data_offset": 2048, 00:16:58.809 "data_size": 63488 00:16:58.809 } 00:16:58.809 ] 00:16:58.809 }' 00:16:58.809 18:20:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.809 18:20:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.377 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:59.377 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:59.377 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:16:59.377 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:59.377 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:59.377 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:59.635 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:59.635 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:59.635 [2024-07-12 18:20:43.331902] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:59.635 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:59.635 "name": "raid_bdev1", 00:16:59.635 "aliases": [ 00:16:59.635 "c202f6e0-0981-423a-a327-29e2bddbfcd3" 00:16:59.635 ], 00:16:59.635 "product_name": "Raid Volume", 00:16:59.635 "block_size": 512, 00:16:59.635 "num_blocks": 63488, 00:16:59.635 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:16:59.636 "assigned_rate_limits": { 00:16:59.636 "rw_ios_per_sec": 0, 00:16:59.636 "rw_mbytes_per_sec": 0, 00:16:59.636 "r_mbytes_per_sec": 0, 00:16:59.636 "w_mbytes_per_sec": 0 00:16:59.636 }, 00:16:59.636 "claimed": false, 00:16:59.636 "zoned": false, 00:16:59.636 "supported_io_types": { 00:16:59.636 "read": true, 00:16:59.636 "write": true, 00:16:59.636 "unmap": false, 00:16:59.636 "flush": false, 00:16:59.636 "reset": true, 00:16:59.636 "nvme_admin": false, 00:16:59.636 "nvme_io": false, 00:16:59.636 "nvme_io_md": false, 00:16:59.636 "write_zeroes": true, 00:16:59.636 "zcopy": false, 00:16:59.636 "get_zone_info": false, 00:16:59.636 "zone_management": false, 00:16:59.636 "zone_append": false, 00:16:59.636 "compare": false, 00:16:59.636 "compare_and_write": false, 00:16:59.636 "abort": false, 00:16:59.636 "seek_hole": false, 
00:16:59.636 "seek_data": false, 00:16:59.636 "copy": false, 00:16:59.636 "nvme_iov_md": false 00:16:59.636 }, 00:16:59.636 "memory_domains": [ 00:16:59.636 { 00:16:59.636 "dma_device_id": "system", 00:16:59.636 "dma_device_type": 1 00:16:59.636 }, 00:16:59.636 { 00:16:59.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.636 "dma_device_type": 2 00:16:59.636 }, 00:16:59.636 { 00:16:59.636 "dma_device_id": "system", 00:16:59.636 "dma_device_type": 1 00:16:59.636 }, 00:16:59.636 { 00:16:59.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.636 "dma_device_type": 2 00:16:59.636 }, 00:16:59.636 { 00:16:59.636 "dma_device_id": "system", 00:16:59.636 "dma_device_type": 1 00:16:59.636 }, 00:16:59.636 { 00:16:59.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.636 "dma_device_type": 2 00:16:59.636 } 00:16:59.636 ], 00:16:59.636 "driver_specific": { 00:16:59.636 "raid": { 00:16:59.636 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:16:59.636 "strip_size_kb": 0, 00:16:59.636 "state": "online", 00:16:59.636 "raid_level": "raid1", 00:16:59.636 "superblock": true, 00:16:59.636 "num_base_bdevs": 3, 00:16:59.636 "num_base_bdevs_discovered": 3, 00:16:59.636 "num_base_bdevs_operational": 3, 00:16:59.636 "base_bdevs_list": [ 00:16:59.636 { 00:16:59.636 "name": "pt1", 00:16:59.636 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.636 "is_configured": true, 00:16:59.636 "data_offset": 2048, 00:16:59.636 "data_size": 63488 00:16:59.636 }, 00:16:59.636 { 00:16:59.636 "name": "pt2", 00:16:59.636 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:59.636 "is_configured": true, 00:16:59.636 "data_offset": 2048, 00:16:59.636 "data_size": 63488 00:16:59.636 }, 00:16:59.636 { 00:16:59.636 "name": "pt3", 00:16:59.636 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:59.636 "is_configured": true, 00:16:59.636 "data_offset": 2048, 00:16:59.636 "data_size": 63488 00:16:59.636 } 00:16:59.636 ] 00:16:59.636 } 00:16:59.636 } 00:16:59.636 }' 00:16:59.636 18:20:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:59.895 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:59.895 pt2 00:16:59.895 pt3' 00:16:59.895 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:59.895 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:59.895 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:00.154 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:00.154 "name": "pt1", 00:17:00.154 "aliases": [ 00:17:00.154 "00000000-0000-0000-0000-000000000001" 00:17:00.154 ], 00:17:00.154 "product_name": "passthru", 00:17:00.154 "block_size": 512, 00:17:00.154 "num_blocks": 65536, 00:17:00.154 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:00.154 "assigned_rate_limits": { 00:17:00.154 "rw_ios_per_sec": 0, 00:17:00.154 "rw_mbytes_per_sec": 0, 00:17:00.154 "r_mbytes_per_sec": 0, 00:17:00.154 "w_mbytes_per_sec": 0 00:17:00.154 }, 00:17:00.154 "claimed": true, 00:17:00.154 "claim_type": "exclusive_write", 00:17:00.154 "zoned": false, 00:17:00.154 "supported_io_types": { 00:17:00.154 "read": true, 00:17:00.154 "write": true, 00:17:00.154 "unmap": true, 00:17:00.154 "flush": true, 00:17:00.154 "reset": true, 00:17:00.154 "nvme_admin": false, 00:17:00.154 "nvme_io": false, 00:17:00.154 "nvme_io_md": false, 00:17:00.154 "write_zeroes": true, 00:17:00.154 "zcopy": true, 00:17:00.154 "get_zone_info": false, 00:17:00.154 "zone_management": false, 00:17:00.154 "zone_append": false, 00:17:00.154 "compare": false, 00:17:00.154 "compare_and_write": false, 00:17:00.154 "abort": true, 00:17:00.154 "seek_hole": false, 00:17:00.154 "seek_data": false, 
00:17:00.154 "copy": true, 00:17:00.154 "nvme_iov_md": false 00:17:00.154 }, 00:17:00.154 "memory_domains": [ 00:17:00.154 { 00:17:00.154 "dma_device_id": "system", 00:17:00.154 "dma_device_type": 1 00:17:00.154 }, 00:17:00.154 { 00:17:00.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.154 "dma_device_type": 2 00:17:00.154 } 00:17:00.154 ], 00:17:00.154 "driver_specific": { 00:17:00.154 "passthru": { 00:17:00.154 "name": "pt1", 00:17:00.154 "base_bdev_name": "malloc1" 00:17:00.154 } 00:17:00.154 } 00:17:00.154 }' 00:17:00.154 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.154 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.154 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:00.154 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.154 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.154 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:00.154 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.154 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.413 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.413 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.413 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.413 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.413 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:00.413 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:17:00.413 18:20:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:00.671 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:00.671 "name": "pt2", 00:17:00.671 "aliases": [ 00:17:00.671 "00000000-0000-0000-0000-000000000002" 00:17:00.671 ], 00:17:00.671 "product_name": "passthru", 00:17:00.671 "block_size": 512, 00:17:00.671 "num_blocks": 65536, 00:17:00.671 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:00.671 "assigned_rate_limits": { 00:17:00.671 "rw_ios_per_sec": 0, 00:17:00.671 "rw_mbytes_per_sec": 0, 00:17:00.671 "r_mbytes_per_sec": 0, 00:17:00.671 "w_mbytes_per_sec": 0 00:17:00.671 }, 00:17:00.671 "claimed": true, 00:17:00.671 "claim_type": "exclusive_write", 00:17:00.671 "zoned": false, 00:17:00.671 "supported_io_types": { 00:17:00.671 "read": true, 00:17:00.671 "write": true, 00:17:00.671 "unmap": true, 00:17:00.671 "flush": true, 00:17:00.671 "reset": true, 00:17:00.671 "nvme_admin": false, 00:17:00.671 "nvme_io": false, 00:17:00.671 "nvme_io_md": false, 00:17:00.671 "write_zeroes": true, 00:17:00.671 "zcopy": true, 00:17:00.671 "get_zone_info": false, 00:17:00.671 "zone_management": false, 00:17:00.671 "zone_append": false, 00:17:00.671 "compare": false, 00:17:00.671 "compare_and_write": false, 00:17:00.671 "abort": true, 00:17:00.671 "seek_hole": false, 00:17:00.671 "seek_data": false, 00:17:00.671 "copy": true, 00:17:00.671 "nvme_iov_md": false 00:17:00.671 }, 00:17:00.671 "memory_domains": [ 00:17:00.671 { 00:17:00.671 "dma_device_id": "system", 00:17:00.671 "dma_device_type": 1 00:17:00.671 }, 00:17:00.671 { 00:17:00.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.671 "dma_device_type": 2 00:17:00.671 } 00:17:00.671 ], 00:17:00.671 "driver_specific": { 00:17:00.671 "passthru": { 00:17:00.671 "name": "pt2", 00:17:00.671 "base_bdev_name": "malloc2" 00:17:00.671 } 00:17:00.671 } 00:17:00.671 }' 00:17:00.671 18:20:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.671 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:00.671 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:00.671 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.671 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:00.931 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:01.204 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:01.204 "name": "pt3", 00:17:01.204 "aliases": [ 00:17:01.204 "00000000-0000-0000-0000-000000000003" 00:17:01.204 ], 00:17:01.204 "product_name": "passthru", 00:17:01.204 "block_size": 512, 00:17:01.204 "num_blocks": 65536, 00:17:01.204 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:01.204 "assigned_rate_limits": { 
00:17:01.204 "rw_ios_per_sec": 0, 00:17:01.204 "rw_mbytes_per_sec": 0, 00:17:01.204 "r_mbytes_per_sec": 0, 00:17:01.204 "w_mbytes_per_sec": 0 00:17:01.204 }, 00:17:01.204 "claimed": true, 00:17:01.204 "claim_type": "exclusive_write", 00:17:01.204 "zoned": false, 00:17:01.204 "supported_io_types": { 00:17:01.204 "read": true, 00:17:01.204 "write": true, 00:17:01.204 "unmap": true, 00:17:01.204 "flush": true, 00:17:01.204 "reset": true, 00:17:01.204 "nvme_admin": false, 00:17:01.204 "nvme_io": false, 00:17:01.204 "nvme_io_md": false, 00:17:01.204 "write_zeroes": true, 00:17:01.204 "zcopy": true, 00:17:01.204 "get_zone_info": false, 00:17:01.204 "zone_management": false, 00:17:01.204 "zone_append": false, 00:17:01.204 "compare": false, 00:17:01.204 "compare_and_write": false, 00:17:01.204 "abort": true, 00:17:01.204 "seek_hole": false, 00:17:01.204 "seek_data": false, 00:17:01.204 "copy": true, 00:17:01.204 "nvme_iov_md": false 00:17:01.204 }, 00:17:01.204 "memory_domains": [ 00:17:01.204 { 00:17:01.204 "dma_device_id": "system", 00:17:01.204 "dma_device_type": 1 00:17:01.204 }, 00:17:01.204 { 00:17:01.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.204 "dma_device_type": 2 00:17:01.204 } 00:17:01.204 ], 00:17:01.204 "driver_specific": { 00:17:01.204 "passthru": { 00:17:01.204 "name": "pt3", 00:17:01.204 "base_bdev_name": "malloc3" 00:17:01.204 } 00:17:01.204 } 00:17:01.204 }' 00:17:01.204 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.204 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:01.204 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:01.204 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.475 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:01.475 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:17:01.475 18:20:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.475 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:01.475 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.475 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.475 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:01.475 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.475 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:01.475 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:01.733 [2024-07-12 18:20:45.357317] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:01.733 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c202f6e0-0981-423a-a327-29e2bddbfcd3 00:17:01.733 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c202f6e0-0981-423a-a327-29e2bddbfcd3 ']' 00:17:01.733 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:01.991 [2024-07-12 18:20:45.601701] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:01.991 [2024-07-12 18:20:45.601720] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:01.991 [2024-07-12 18:20:45.601766] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:01.991 [2024-07-12 18:20:45.601832] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:17:01.991 [2024-07-12 18:20:45.601844] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23d1ea0 name raid_bdev1, state offline 00:17:01.991 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.991 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:02.250 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:02.250 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:02.250 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.250 18:20:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:02.508 18:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.508 18:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:02.766 18:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:02.766 18:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:03.025 18:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:03.025 18:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:03.284 18:20:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 
raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:03.542 [2024-07-12 18:20:47.041435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:03.542 [2024-07-12 18:20:47.042811] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:03.542 [2024-07-12 18:20:47.042852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:03.542 [2024-07-12 18:20:47.042895] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:03.542 [2024-07-12 18:20:47.042943] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:03.542 [2024-07-12 18:20:47.042967] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:03.542 [2024-07-12 18:20:47.042985] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:03.542 [2024-07-12 18:20:47.042994] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x257cff0 name raid_bdev1, state configuring 00:17:03.542 request: 00:17:03.542 { 00:17:03.542 "name": "raid_bdev1", 00:17:03.542 "raid_level": "raid1", 00:17:03.542 "base_bdevs": [ 00:17:03.542 "malloc1", 00:17:03.542 "malloc2", 00:17:03.542 "malloc3" 00:17:03.542 ], 00:17:03.542 "superblock": false, 00:17:03.542 "method": "bdev_raid_create", 00:17:03.542 "req_id": 1 00:17:03.542 } 00:17:03.542 Got JSON-RPC error response 00:17:03.542 response: 00:17:03.542 { 00:17:03.542 "code": -17, 00:17:03.542 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:03.542 } 00:17:03.542 18:20:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:03.542 18:20:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:03.542 18:20:47 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:03.542 18:20:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:03.542 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.542 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:03.800 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:03.800 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:03.800 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:04.058 [2024-07-12 18:20:47.534689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:04.059 [2024-07-12 18:20:47.534729] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:04.059 [2024-07-12 18:20:47.534749] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d97a0 00:17:04.059 [2024-07-12 18:20:47.534762] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:04.059 [2024-07-12 18:20:47.536345] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:04.059 [2024-07-12 18:20:47.536373] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:04.059 [2024-07-12 18:20:47.536435] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:04.059 [2024-07-12 18:20:47.536459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:04.059 pt1 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring 
raid1 0 3 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.059 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:04.317 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.317 "name": "raid_bdev1", 00:17:04.317 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:04.317 "strip_size_kb": 0, 00:17:04.317 "state": "configuring", 00:17:04.317 "raid_level": "raid1", 00:17:04.317 "superblock": true, 00:17:04.317 "num_base_bdevs": 3, 00:17:04.317 "num_base_bdevs_discovered": 1, 00:17:04.317 "num_base_bdevs_operational": 3, 00:17:04.317 "base_bdevs_list": [ 00:17:04.317 { 00:17:04.317 "name": "pt1", 00:17:04.317 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:04.317 "is_configured": true, 00:17:04.317 "data_offset": 2048, 00:17:04.317 
"data_size": 63488 00:17:04.317 }, 00:17:04.317 { 00:17:04.317 "name": null, 00:17:04.317 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:04.317 "is_configured": false, 00:17:04.317 "data_offset": 2048, 00:17:04.317 "data_size": 63488 00:17:04.317 }, 00:17:04.317 { 00:17:04.317 "name": null, 00:17:04.317 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:04.317 "is_configured": false, 00:17:04.317 "data_offset": 2048, 00:17:04.317 "data_size": 63488 00:17:04.317 } 00:17:04.317 ] 00:17:04.317 }' 00:17:04.317 18:20:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.317 18:20:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.884 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:04.884 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:04.884 [2024-07-12 18:20:48.557416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:04.884 [2024-07-12 18:20:48.557462] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:04.884 [2024-07-12 18:20:48.557479] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d0a10 00:17:04.884 [2024-07-12 18:20:48.557491] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:04.884 [2024-07-12 18:20:48.557815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:04.884 [2024-07-12 18:20:48.557832] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:04.884 [2024-07-12 18:20:48.557891] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:04.884 [2024-07-12 18:20:48.557910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:17:04.884 pt2 00:17:04.884 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:05.142 [2024-07-12 18:20:48.802071] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.142 18:20:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:05.399 18:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.399 "name": "raid_bdev1", 00:17:05.399 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:05.399 "strip_size_kb": 
0, 00:17:05.399 "state": "configuring", 00:17:05.399 "raid_level": "raid1", 00:17:05.399 "superblock": true, 00:17:05.399 "num_base_bdevs": 3, 00:17:05.399 "num_base_bdevs_discovered": 1, 00:17:05.399 "num_base_bdevs_operational": 3, 00:17:05.399 "base_bdevs_list": [ 00:17:05.399 { 00:17:05.399 "name": "pt1", 00:17:05.399 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:05.399 "is_configured": true, 00:17:05.399 "data_offset": 2048, 00:17:05.399 "data_size": 63488 00:17:05.399 }, 00:17:05.399 { 00:17:05.399 "name": null, 00:17:05.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:05.399 "is_configured": false, 00:17:05.399 "data_offset": 2048, 00:17:05.399 "data_size": 63488 00:17:05.399 }, 00:17:05.399 { 00:17:05.399 "name": null, 00:17:05.399 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:05.399 "is_configured": false, 00:17:05.399 "data_offset": 2048, 00:17:05.399 "data_size": 63488 00:17:05.399 } 00:17:05.399 ] 00:17:05.399 }' 00:17:05.399 18:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.399 18:20:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.966 18:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:05.966 18:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:05.966 18:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:06.224 [2024-07-12 18:20:49.856857] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:06.224 [2024-07-12 18:20:49.856911] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.224 [2024-07-12 18:20:49.856935] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d9a10 00:17:06.224 
[2024-07-12 18:20:49.856947] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.224 [2024-07-12 18:20:49.857279] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.224 [2024-07-12 18:20:49.857296] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:06.224 [2024-07-12 18:20:49.857358] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:06.224 [2024-07-12 18:20:49.857376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:06.224 pt2 00:17:06.224 18:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:06.224 18:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:06.224 18:20:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:06.483 [2024-07-12 18:20:50.113544] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:06.483 [2024-07-12 18:20:50.113591] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.483 [2024-07-12 18:20:50.113607] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d06c0 00:17:06.483 [2024-07-12 18:20:50.113620] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.483 [2024-07-12 18:20:50.113950] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.483 [2024-07-12 18:20:50.113968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:06.483 [2024-07-12 18:20:50.114027] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:06.483 [2024-07-12 18:20:50.114045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt3 is claimed 00:17:06.483 [2024-07-12 18:20:50.114152] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2573c00 00:17:06.483 [2024-07-12 18:20:50.114162] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:06.483 [2024-07-12 18:20:50.114329] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23d3610 00:17:06.483 [2024-07-12 18:20:50.114455] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2573c00 00:17:06.483 [2024-07-12 18:20:50.114465] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2573c00 00:17:06.483 [2024-07-12 18:20:50.114563] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:06.483 pt3 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.483 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:06.742 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.742 "name": "raid_bdev1", 00:17:06.742 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:06.742 "strip_size_kb": 0, 00:17:06.742 "state": "online", 00:17:06.742 "raid_level": "raid1", 00:17:06.742 "superblock": true, 00:17:06.742 "num_base_bdevs": 3, 00:17:06.742 "num_base_bdevs_discovered": 3, 00:17:06.742 "num_base_bdevs_operational": 3, 00:17:06.742 "base_bdevs_list": [ 00:17:06.742 { 00:17:06.742 "name": "pt1", 00:17:06.742 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:06.742 "is_configured": true, 00:17:06.742 "data_offset": 2048, 00:17:06.742 "data_size": 63488 00:17:06.742 }, 00:17:06.742 { 00:17:06.742 "name": "pt2", 00:17:06.742 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:06.742 "is_configured": true, 00:17:06.742 "data_offset": 2048, 00:17:06.742 "data_size": 63488 00:17:06.742 }, 00:17:06.742 { 00:17:06.742 "name": "pt3", 00:17:06.742 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:06.742 "is_configured": true, 00:17:06.742 "data_offset": 2048, 00:17:06.742 "data_size": 63488 00:17:06.742 } 00:17:06.742 ] 00:17:06.742 }' 00:17:06.742 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.742 18:20:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.309 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:07.309 18:20:50 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:07.309 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:07.309 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:07.309 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:07.309 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:07.309 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:07.309 18:20:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:07.567 [2024-07-12 18:20:51.144563] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:07.567 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:07.567 "name": "raid_bdev1", 00:17:07.567 "aliases": [ 00:17:07.567 "c202f6e0-0981-423a-a327-29e2bddbfcd3" 00:17:07.567 ], 00:17:07.567 "product_name": "Raid Volume", 00:17:07.567 "block_size": 512, 00:17:07.567 "num_blocks": 63488, 00:17:07.567 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:07.567 "assigned_rate_limits": { 00:17:07.567 "rw_ios_per_sec": 0, 00:17:07.567 "rw_mbytes_per_sec": 0, 00:17:07.568 "r_mbytes_per_sec": 0, 00:17:07.568 "w_mbytes_per_sec": 0 00:17:07.568 }, 00:17:07.568 "claimed": false, 00:17:07.568 "zoned": false, 00:17:07.568 "supported_io_types": { 00:17:07.568 "read": true, 00:17:07.568 "write": true, 00:17:07.568 "unmap": false, 00:17:07.568 "flush": false, 00:17:07.568 "reset": true, 00:17:07.568 "nvme_admin": false, 00:17:07.568 "nvme_io": false, 00:17:07.568 "nvme_io_md": false, 00:17:07.568 "write_zeroes": true, 00:17:07.568 "zcopy": false, 00:17:07.568 "get_zone_info": false, 00:17:07.568 "zone_management": false, 00:17:07.568 "zone_append": false, 
00:17:07.568 "compare": false, 00:17:07.568 "compare_and_write": false, 00:17:07.568 "abort": false, 00:17:07.568 "seek_hole": false, 00:17:07.568 "seek_data": false, 00:17:07.568 "copy": false, 00:17:07.568 "nvme_iov_md": false 00:17:07.568 }, 00:17:07.568 "memory_domains": [ 00:17:07.568 { 00:17:07.568 "dma_device_id": "system", 00:17:07.568 "dma_device_type": 1 00:17:07.568 }, 00:17:07.568 { 00:17:07.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.568 "dma_device_type": 2 00:17:07.568 }, 00:17:07.568 { 00:17:07.568 "dma_device_id": "system", 00:17:07.568 "dma_device_type": 1 00:17:07.568 }, 00:17:07.568 { 00:17:07.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.568 "dma_device_type": 2 00:17:07.568 }, 00:17:07.568 { 00:17:07.568 "dma_device_id": "system", 00:17:07.568 "dma_device_type": 1 00:17:07.568 }, 00:17:07.568 { 00:17:07.568 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.568 "dma_device_type": 2 00:17:07.568 } 00:17:07.568 ], 00:17:07.568 "driver_specific": { 00:17:07.568 "raid": { 00:17:07.568 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:07.568 "strip_size_kb": 0, 00:17:07.568 "state": "online", 00:17:07.568 "raid_level": "raid1", 00:17:07.568 "superblock": true, 00:17:07.568 "num_base_bdevs": 3, 00:17:07.568 "num_base_bdevs_discovered": 3, 00:17:07.568 "num_base_bdevs_operational": 3, 00:17:07.568 "base_bdevs_list": [ 00:17:07.568 { 00:17:07.568 "name": "pt1", 00:17:07.568 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:07.568 "is_configured": true, 00:17:07.568 "data_offset": 2048, 00:17:07.568 "data_size": 63488 00:17:07.568 }, 00:17:07.568 { 00:17:07.568 "name": "pt2", 00:17:07.568 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:07.568 "is_configured": true, 00:17:07.568 "data_offset": 2048, 00:17:07.568 "data_size": 63488 00:17:07.568 }, 00:17:07.568 { 00:17:07.568 "name": "pt3", 00:17:07.568 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:07.568 "is_configured": true, 00:17:07.568 "data_offset": 2048, 
00:17:07.568 "data_size": 63488 00:17:07.568 } 00:17:07.568 ] 00:17:07.568 } 00:17:07.568 } 00:17:07.568 }' 00:17:07.568 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:07.568 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:07.568 pt2 00:17:07.568 pt3' 00:17:07.568 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.568 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:07.568 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.826 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.826 "name": "pt1", 00:17:07.826 "aliases": [ 00:17:07.826 "00000000-0000-0000-0000-000000000001" 00:17:07.826 ], 00:17:07.826 "product_name": "passthru", 00:17:07.826 "block_size": 512, 00:17:07.826 "num_blocks": 65536, 00:17:07.826 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:07.826 "assigned_rate_limits": { 00:17:07.826 "rw_ios_per_sec": 0, 00:17:07.826 "rw_mbytes_per_sec": 0, 00:17:07.826 "r_mbytes_per_sec": 0, 00:17:07.826 "w_mbytes_per_sec": 0 00:17:07.826 }, 00:17:07.826 "claimed": true, 00:17:07.826 "claim_type": "exclusive_write", 00:17:07.826 "zoned": false, 00:17:07.826 "supported_io_types": { 00:17:07.826 "read": true, 00:17:07.826 "write": true, 00:17:07.826 "unmap": true, 00:17:07.826 "flush": true, 00:17:07.826 "reset": true, 00:17:07.826 "nvme_admin": false, 00:17:07.826 "nvme_io": false, 00:17:07.826 "nvme_io_md": false, 00:17:07.826 "write_zeroes": true, 00:17:07.826 "zcopy": true, 00:17:07.826 "get_zone_info": false, 00:17:07.826 "zone_management": false, 00:17:07.826 "zone_append": false, 00:17:07.826 "compare": false, 
00:17:07.826 "compare_and_write": false, 00:17:07.826 "abort": true, 00:17:07.826 "seek_hole": false, 00:17:07.826 "seek_data": false, 00:17:07.826 "copy": true, 00:17:07.826 "nvme_iov_md": false 00:17:07.826 }, 00:17:07.826 "memory_domains": [ 00:17:07.826 { 00:17:07.826 "dma_device_id": "system", 00:17:07.826 "dma_device_type": 1 00:17:07.826 }, 00:17:07.826 { 00:17:07.826 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.826 "dma_device_type": 2 00:17:07.826 } 00:17:07.826 ], 00:17:07.826 "driver_specific": { 00:17:07.826 "passthru": { 00:17:07.826 "name": "pt1", 00:17:07.826 "base_bdev_name": "malloc1" 00:17:07.826 } 00:17:07.826 } 00:17:07.826 }' 00:17:07.826 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.826 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.826 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.826 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:08.084 18:20:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:08.084 18:20:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:08.342 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:08.342 "name": "pt2", 00:17:08.342 "aliases": [ 00:17:08.342 "00000000-0000-0000-0000-000000000002" 00:17:08.342 ], 00:17:08.342 "product_name": "passthru", 00:17:08.342 "block_size": 512, 00:17:08.342 "num_blocks": 65536, 00:17:08.342 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:08.342 "assigned_rate_limits": { 00:17:08.342 "rw_ios_per_sec": 0, 00:17:08.342 "rw_mbytes_per_sec": 0, 00:17:08.342 "r_mbytes_per_sec": 0, 00:17:08.342 "w_mbytes_per_sec": 0 00:17:08.342 }, 00:17:08.342 "claimed": true, 00:17:08.342 "claim_type": "exclusive_write", 00:17:08.342 "zoned": false, 00:17:08.342 "supported_io_types": { 00:17:08.342 "read": true, 00:17:08.342 "write": true, 00:17:08.342 "unmap": true, 00:17:08.342 "flush": true, 00:17:08.342 "reset": true, 00:17:08.342 "nvme_admin": false, 00:17:08.342 "nvme_io": false, 00:17:08.342 "nvme_io_md": false, 00:17:08.342 "write_zeroes": true, 00:17:08.342 "zcopy": true, 00:17:08.342 "get_zone_info": false, 00:17:08.342 "zone_management": false, 00:17:08.342 "zone_append": false, 00:17:08.342 "compare": false, 00:17:08.342 "compare_and_write": false, 00:17:08.342 "abort": true, 00:17:08.342 "seek_hole": false, 00:17:08.342 "seek_data": false, 00:17:08.342 "copy": true, 00:17:08.342 "nvme_iov_md": false 00:17:08.342 }, 00:17:08.342 "memory_domains": [ 00:17:08.342 { 00:17:08.342 "dma_device_id": "system", 00:17:08.342 "dma_device_type": 1 00:17:08.342 }, 00:17:08.342 { 00:17:08.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.342 "dma_device_type": 2 00:17:08.342 } 00:17:08.342 ], 00:17:08.342 "driver_specific": { 00:17:08.342 "passthru": { 00:17:08.342 
"name": "pt2", 00:17:08.342 "base_bdev_name": "malloc2" 00:17:08.342 } 00:17:08.342 } 00:17:08.342 }' 00:17:08.342 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.600 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:08.600 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:08.600 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.600 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:08.600 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:08.600 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.600 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:08.600 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:08.601 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.859 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:08.859 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:08.859 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:08.859 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:08.859 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:09.116 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:09.116 "name": "pt3", 00:17:09.116 "aliases": [ 00:17:09.116 "00000000-0000-0000-0000-000000000003" 00:17:09.116 ], 00:17:09.116 "product_name": "passthru", 00:17:09.116 "block_size": 512, 00:17:09.116 
"num_blocks": 65536, 00:17:09.116 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:09.116 "assigned_rate_limits": { 00:17:09.116 "rw_ios_per_sec": 0, 00:17:09.116 "rw_mbytes_per_sec": 0, 00:17:09.116 "r_mbytes_per_sec": 0, 00:17:09.116 "w_mbytes_per_sec": 0 00:17:09.116 }, 00:17:09.116 "claimed": true, 00:17:09.116 "claim_type": "exclusive_write", 00:17:09.116 "zoned": false, 00:17:09.116 "supported_io_types": { 00:17:09.116 "read": true, 00:17:09.116 "write": true, 00:17:09.116 "unmap": true, 00:17:09.116 "flush": true, 00:17:09.116 "reset": true, 00:17:09.116 "nvme_admin": false, 00:17:09.116 "nvme_io": false, 00:17:09.116 "nvme_io_md": false, 00:17:09.116 "write_zeroes": true, 00:17:09.116 "zcopy": true, 00:17:09.116 "get_zone_info": false, 00:17:09.117 "zone_management": false, 00:17:09.117 "zone_append": false, 00:17:09.117 "compare": false, 00:17:09.117 "compare_and_write": false, 00:17:09.117 "abort": true, 00:17:09.117 "seek_hole": false, 00:17:09.117 "seek_data": false, 00:17:09.117 "copy": true, 00:17:09.117 "nvme_iov_md": false 00:17:09.117 }, 00:17:09.117 "memory_domains": [ 00:17:09.117 { 00:17:09.117 "dma_device_id": "system", 00:17:09.117 "dma_device_type": 1 00:17:09.117 }, 00:17:09.117 { 00:17:09.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.117 "dma_device_type": 2 00:17:09.117 } 00:17:09.117 ], 00:17:09.117 "driver_specific": { 00:17:09.117 "passthru": { 00:17:09.117 "name": "pt3", 00:17:09.117 "base_bdev_name": "malloc3" 00:17:09.117 } 00:17:09.117 } 00:17:09.117 }' 00:17:09.117 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.117 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.117 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:09.117 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.117 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:17:09.117 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:09.117 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.374 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.374 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:09.374 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.374 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.374 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:09.374 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:09.374 18:20:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:09.632 [2024-07-12 18:20:53.218073] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:09.632 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c202f6e0-0981-423a-a327-29e2bddbfcd3 '!=' c202f6e0-0981-423a-a327-29e2bddbfcd3 ']' 00:17:09.632 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:09.632 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:09.632 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:09.632 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:09.890 [2024-07-12 18:20:53.462459] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.890 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:10.149 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.149 "name": "raid_bdev1", 00:17:10.149 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:10.149 "strip_size_kb": 0, 00:17:10.149 "state": "online", 00:17:10.149 "raid_level": "raid1", 00:17:10.149 "superblock": true, 00:17:10.149 "num_base_bdevs": 3, 00:17:10.149 "num_base_bdevs_discovered": 2, 00:17:10.149 "num_base_bdevs_operational": 2, 00:17:10.149 "base_bdevs_list": [ 00:17:10.149 { 00:17:10.149 "name": null, 00:17:10.149 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.149 "is_configured": false, 00:17:10.149 
"data_offset": 2048, 00:17:10.149 "data_size": 63488 00:17:10.149 }, 00:17:10.149 { 00:17:10.149 "name": "pt2", 00:17:10.149 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:10.149 "is_configured": true, 00:17:10.149 "data_offset": 2048, 00:17:10.149 "data_size": 63488 00:17:10.149 }, 00:17:10.149 { 00:17:10.149 "name": "pt3", 00:17:10.149 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:10.149 "is_configured": true, 00:17:10.149 "data_offset": 2048, 00:17:10.149 "data_size": 63488 00:17:10.149 } 00:17:10.149 ] 00:17:10.149 }' 00:17:10.149 18:20:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.149 18:20:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.716 18:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:10.975 [2024-07-12 18:20:54.557335] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:10.975 [2024-07-12 18:20:54.557360] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:10.975 [2024-07-12 18:20:54.557412] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:10.975 [2024-07-12 18:20:54.557465] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:10.975 [2024-07-12 18:20:54.557476] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2573c00 name raid_bdev1, state offline 00:17:10.975 18:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.975 18:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:11.233 18:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 
00:17:11.234 18:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:11.234 18:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:11.234 18:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:11.234 18:20:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:11.493 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:11.493 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:11.493 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:11.751 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:11.751 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:11.751 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:11.751 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:11.751 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:12.010 [2024-07-12 18:20:55.535866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:12.010 [2024-07-12 18:20:55.535914] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.010 [2024-07-12 18:20:55.535941] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d1310 00:17:12.010 [2024-07-12 18:20:55.535954] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.010 [2024-07-12 18:20:55.537598] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.010 [2024-07-12 18:20:55.537627] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:12.010 [2024-07-12 18:20:55.537695] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:12.010 [2024-07-12 18:20:55.537722] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:12.010 pt2 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.010 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:17:12.268 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.268 "name": "raid_bdev1", 00:17:12.268 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:12.268 "strip_size_kb": 0, 00:17:12.268 "state": "configuring", 00:17:12.268 "raid_level": "raid1", 00:17:12.268 "superblock": true, 00:17:12.268 "num_base_bdevs": 3, 00:17:12.268 "num_base_bdevs_discovered": 1, 00:17:12.268 "num_base_bdevs_operational": 2, 00:17:12.268 "base_bdevs_list": [ 00:17:12.268 { 00:17:12.268 "name": null, 00:17:12.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.268 "is_configured": false, 00:17:12.268 "data_offset": 2048, 00:17:12.268 "data_size": 63488 00:17:12.268 }, 00:17:12.268 { 00:17:12.268 "name": "pt2", 00:17:12.268 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:12.268 "is_configured": true, 00:17:12.268 "data_offset": 2048, 00:17:12.268 "data_size": 63488 00:17:12.268 }, 00:17:12.268 { 00:17:12.268 "name": null, 00:17:12.268 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:12.268 "is_configured": false, 00:17:12.268 "data_offset": 2048, 00:17:12.268 "data_size": 63488 00:17:12.268 } 00:17:12.268 ] 00:17:12.268 }' 00:17:12.268 18:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.268 18:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.836 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:12.836 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:12.836 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:17:12.836 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:13.095 [2024-07-12 18:20:56.582662] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:13.095 [2024-07-12 18:20:56.582712] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:13.095 [2024-07-12 18:20:56.582733] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23cfec0 00:17:13.095 [2024-07-12 18:20:56.582746] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:13.095 [2024-07-12 18:20:56.583092] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:13.095 [2024-07-12 18:20:56.583110] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:13.095 [2024-07-12 18:20:56.583175] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:13.095 [2024-07-12 18:20:56.583195] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:13.095 [2024-07-12 18:20:56.583294] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2571cc0 00:17:13.095 [2024-07-12 18:20:56.583304] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:13.095 [2024-07-12 18:20:56.583465] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25726d0 00:17:13.095 [2024-07-12 18:20:56.583590] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2571cc0 00:17:13.095 [2024-07-12 18:20:56.583599] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2571cc0 00:17:13.095 [2024-07-12 18:20:56.583695] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:13.095 pt3 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.095 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:13.353 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.353 "name": "raid_bdev1", 00:17:13.353 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:13.353 "strip_size_kb": 0, 00:17:13.353 "state": "online", 00:17:13.353 "raid_level": "raid1", 00:17:13.353 "superblock": true, 00:17:13.353 "num_base_bdevs": 3, 00:17:13.353 "num_base_bdevs_discovered": 2, 00:17:13.353 "num_base_bdevs_operational": 2, 00:17:13.353 "base_bdevs_list": [ 00:17:13.353 { 00:17:13.353 "name": null, 00:17:13.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:13.353 "is_configured": false, 00:17:13.353 "data_offset": 2048, 00:17:13.353 "data_size": 63488 00:17:13.353 }, 00:17:13.353 { 00:17:13.353 "name": "pt2", 00:17:13.353 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:13.353 "is_configured": true, 00:17:13.353 
"data_offset": 2048, 00:17:13.353 "data_size": 63488 00:17:13.353 }, 00:17:13.353 { 00:17:13.353 "name": "pt3", 00:17:13.353 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:13.353 "is_configured": true, 00:17:13.353 "data_offset": 2048, 00:17:13.353 "data_size": 63488 00:17:13.353 } 00:17:13.353 ] 00:17:13.353 }' 00:17:13.353 18:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.353 18:20:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:13.919 18:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:14.177 [2024-07-12 18:20:57.681553] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:14.177 [2024-07-12 18:20:57.681578] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:14.177 [2024-07-12 18:20:57.681625] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:14.177 [2024-07-12 18:20:57.681675] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:14.177 [2024-07-12 18:20:57.681686] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2571cc0 name raid_bdev1, state offline 00:17:14.177 18:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:14.177 18:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.434 18:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:14.434 18:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:14.434 18:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:17:14.434 18:20:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:17:14.434 18:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:14.693 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:14.693 [2024-07-12 18:20:58.415467] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:14.693 [2024-07-12 18:20:58.415510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:14.693 [2024-07-12 18:20:58.415526] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23cfec0 00:17:14.693 [2024-07-12 18:20:58.415544] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:14.693 [2024-07-12 18:20:58.417127] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:14.693 [2024-07-12 18:20:58.417155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:14.693 [2024-07-12 18:20:58.417222] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:14.693 [2024-07-12 18:20:58.417246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:14.693 [2024-07-12 18:20:58.417338] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:14.693 [2024-07-12 18:20:58.417351] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:14.693 [2024-07-12 18:20:58.417364] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2571f40 name raid_bdev1, state configuring 00:17:14.693 [2024-07-12 18:20:58.417386] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:14.950 pt1 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.950 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:15.208 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.208 "name": "raid_bdev1", 00:17:15.208 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:15.208 "strip_size_kb": 0, 00:17:15.208 "state": "configuring", 00:17:15.208 "raid_level": "raid1", 00:17:15.208 "superblock": true, 00:17:15.208 "num_base_bdevs": 3, 
00:17:15.208 "num_base_bdevs_discovered": 1, 00:17:15.208 "num_base_bdevs_operational": 2, 00:17:15.208 "base_bdevs_list": [ 00:17:15.208 { 00:17:15.208 "name": null, 00:17:15.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.208 "is_configured": false, 00:17:15.208 "data_offset": 2048, 00:17:15.208 "data_size": 63488 00:17:15.208 }, 00:17:15.208 { 00:17:15.208 "name": "pt2", 00:17:15.208 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:15.208 "is_configured": true, 00:17:15.208 "data_offset": 2048, 00:17:15.208 "data_size": 63488 00:17:15.208 }, 00:17:15.208 { 00:17:15.208 "name": null, 00:17:15.208 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:15.208 "is_configured": false, 00:17:15.208 "data_offset": 2048, 00:17:15.208 "data_size": 63488 00:17:15.208 } 00:17:15.208 ] 00:17:15.208 }' 00:17:15.208 18:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.208 18:20:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.774 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:15.774 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:16.031 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:16.031 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:16.289 [2024-07-12 18:20:59.775075] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:16.289 [2024-07-12 18:20:59.775124] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:16.289 [2024-07-12 18:20:59.775143] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d30c0 00:17:16.289 [2024-07-12 18:20:59.775156] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:16.289 [2024-07-12 18:20:59.775493] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:16.289 [2024-07-12 18:20:59.775510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:16.289 [2024-07-12 18:20:59.775572] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:16.289 [2024-07-12 18:20:59.775590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:16.289 [2024-07-12 18:20:59.775690] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23d3a40 00:17:16.289 [2024-07-12 18:20:59.775701] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:16.289 [2024-07-12 18:20:59.775864] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25726c0 00:17:16.289 [2024-07-12 18:20:59.775999] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23d3a40 00:17:16.289 [2024-07-12 18:20:59.776010] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23d3a40 00:17:16.289 [2024-07-12 18:20:59.776104] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:16.289 pt3 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.289 18:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:16.548 18:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.548 "name": "raid_bdev1", 00:17:16.548 "uuid": "c202f6e0-0981-423a-a327-29e2bddbfcd3", 00:17:16.548 "strip_size_kb": 0, 00:17:16.548 "state": "online", 00:17:16.548 "raid_level": "raid1", 00:17:16.548 "superblock": true, 00:17:16.548 "num_base_bdevs": 3, 00:17:16.548 "num_base_bdevs_discovered": 2, 00:17:16.548 "num_base_bdevs_operational": 2, 00:17:16.548 "base_bdevs_list": [ 00:17:16.548 { 00:17:16.548 "name": null, 00:17:16.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:16.548 "is_configured": false, 00:17:16.548 "data_offset": 2048, 00:17:16.548 "data_size": 63488 00:17:16.548 }, 00:17:16.548 { 00:17:16.548 "name": "pt2", 00:17:16.548 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:16.548 "is_configured": true, 00:17:16.548 "data_offset": 2048, 00:17:16.548 "data_size": 63488 00:17:16.548 }, 00:17:16.548 { 00:17:16.548 "name": "pt3", 00:17:16.548 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:16.548 "is_configured": true, 00:17:16.548 
"data_offset": 2048, 00:17:16.548 "data_size": 63488 00:17:16.548 } 00:17:16.548 ] 00:17:16.548 }' 00:17:16.548 18:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.548 18:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.115 18:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:17.115 18:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:17.115 18:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:17.115 18:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:17.115 18:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:17.374 [2024-07-12 18:21:00.998585] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' c202f6e0-0981-423a-a327-29e2bddbfcd3 '!=' c202f6e0-0981-423a-a327-29e2bddbfcd3 ']' 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2513038 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2513038 ']' 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2513038 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2513038 00:17:17.374 
18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2513038' 00:17:17.374 killing process with pid 2513038 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2513038 00:17:17.374 [2024-07-12 18:21:01.071372] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:17.374 [2024-07-12 18:21:01.071421] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:17.374 [2024-07-12 18:21:01.071473] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:17.374 [2024-07-12 18:21:01.071485] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23d3a40 name raid_bdev1, state offline 00:17:17.374 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2513038 00:17:17.374 [2024-07-12 18:21:01.099007] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:17.685 18:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:17.685 00:17:17.685 real 0m21.768s 00:17:17.685 user 0m39.728s 00:17:17.685 sys 0m4.007s 00:17:17.685 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:17.685 18:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.685 ************************************ 00:17:17.685 END TEST raid_superblock_test 00:17:17.685 ************************************ 00:17:17.685 18:21:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:17.685 18:21:01 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:17:17.685 18:21:01 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:17.685 18:21:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:17.685 18:21:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:17.685 ************************************ 00:17:17.685 START TEST raid_read_error_test 00:17:17.685 ************************************ 00:17:17.685 18:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:17:17.685 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:17.685 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:17.685 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ovmoTBZrwf 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2516388 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2516388 /var/tmp/spdk-raid.sock 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2516388 ']' 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:17.944 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:17.944 18:21:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.944 [2024-07-12 18:21:01.474514] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:17:17.944 [2024-07-12 18:21:01.474585] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2516388 ] 00:17:17.944 [2024-07-12 18:21:01.603775] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:18.202 [2024-07-12 18:21:01.710297] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:18.202 [2024-07-12 18:21:01.774490] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:18.202 [2024-07-12 18:21:01.774530] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:18.769 18:21:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:18.769 18:21:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:18.769 18:21:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:18.769 18:21:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:19.027 BaseBdev1_malloc 00:17:19.027 18:21:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:19.287 true 00:17:19.287 18:21:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:19.547 [2024-07-12 18:21:03.117030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:19.547 [2024-07-12 18:21:03.117073] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:19.547 [2024-07-12 18:21:03.117093] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16ef0d0 00:17:19.547 [2024-07-12 18:21:03.117106] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:19.547 [2024-07-12 18:21:03.118979] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:19.547 [2024-07-12 18:21:03.119007] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:19.547 BaseBdev1 00:17:19.547 18:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:19.547 18:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:19.805 BaseBdev2_malloc 00:17:19.806 18:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:20.064 true 00:17:20.064 18:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:20.321 [2024-07-12 18:21:03.844796] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:20.321 [2024-07-12 18:21:03.844839] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:20.321 [2024-07-12 18:21:03.844860] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f3910 00:17:20.321 [2024-07-12 18:21:03.844873] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:20.321 [2024-07-12 18:21:03.846483] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:20.321 [2024-07-12 18:21:03.846510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:20.321 BaseBdev2 00:17:20.321 18:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:20.321 18:21:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:20.579 BaseBdev3_malloc 00:17:20.579 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:20.838 true 00:17:20.838 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:20.838 [2024-07-12 18:21:04.559278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:20.838 [2024-07-12 18:21:04.559324] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:20.838 [2024-07-12 18:21:04.559345] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f5bd0 00:17:20.838 [2024-07-12 18:21:04.559357] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:20.838 [2024-07-12 18:21:04.560960] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:20.838 [2024-07-12 18:21:04.560988] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:20.838 BaseBdev3 00:17:21.096 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:21.096 [2024-07-12 18:21:04.799945] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:21.097 [2024-07-12 18:21:04.801306] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:21.097 [2024-07-12 18:21:04.801378] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:21.097 [2024-07-12 18:21:04.801583] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16f7280 00:17:21.097 [2024-07-12 18:21:04.801595] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:21.097 [2024-07-12 18:21:04.801795] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16f6e20 00:17:21.097 [2024-07-12 18:21:04.801956] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16f7280 00:17:21.097 [2024-07-12 18:21:04.801967] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16f7280 00:17:21.097 [2024-07-12 18:21:04.802072] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:21.097 18:21:04 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:21.097 18:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.663 18:21:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.663 "name": "raid_bdev1", 00:17:21.663 "uuid": "32413bfd-fe3c-446d-99ff-5ee66c59b0fe", 00:17:21.663 "strip_size_kb": 0, 00:17:21.663 "state": "online", 00:17:21.663 "raid_level": "raid1", 00:17:21.663 "superblock": true, 00:17:21.663 "num_base_bdevs": 3, 00:17:21.663 "num_base_bdevs_discovered": 3, 00:17:21.663 "num_base_bdevs_operational": 3, 00:17:21.663 "base_bdevs_list": [ 00:17:21.663 { 00:17:21.663 "name": "BaseBdev1", 00:17:21.663 "uuid": "eee33354-598d-52c1-b1b2-b29c767d0fea", 00:17:21.663 "is_configured": true, 00:17:21.663 "data_offset": 2048, 00:17:21.663 "data_size": 63488 00:17:21.663 }, 00:17:21.663 { 00:17:21.663 "name": "BaseBdev2", 00:17:21.663 "uuid": "01ed77a1-5974-55da-8f6a-0c6f7b718518", 00:17:21.663 
"is_configured": true, 00:17:21.663 "data_offset": 2048, 00:17:21.663 "data_size": 63488 00:17:21.663 }, 00:17:21.663 { 00:17:21.663 "name": "BaseBdev3", 00:17:21.663 "uuid": "94dfa98a-ef02-525d-8691-13f18edddeb6", 00:17:21.663 "is_configured": true, 00:17:21.663 "data_offset": 2048, 00:17:21.663 "data_size": 63488 00:17:21.663 } 00:17:21.663 ] 00:17:21.663 }' 00:17:21.663 18:21:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.663 18:21:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.230 18:21:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:22.230 18:21:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:22.230 [2024-07-12 18:21:05.774802] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1544e00 00:17:23.166 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.425 18:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:23.684 18:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.684 "name": "raid_bdev1", 00:17:23.684 "uuid": "32413bfd-fe3c-446d-99ff-5ee66c59b0fe", 00:17:23.684 "strip_size_kb": 0, 00:17:23.684 "state": "online", 00:17:23.684 "raid_level": "raid1", 00:17:23.684 "superblock": true, 00:17:23.684 "num_base_bdevs": 3, 00:17:23.684 "num_base_bdevs_discovered": 3, 00:17:23.684 "num_base_bdevs_operational": 3, 00:17:23.684 "base_bdevs_list": [ 00:17:23.684 { 00:17:23.684 "name": "BaseBdev1", 00:17:23.684 "uuid": "eee33354-598d-52c1-b1b2-b29c767d0fea", 00:17:23.684 "is_configured": true, 00:17:23.684 "data_offset": 2048, 00:17:23.684 "data_size": 63488 00:17:23.684 }, 00:17:23.684 { 00:17:23.684 "name": "BaseBdev2", 00:17:23.684 "uuid": "01ed77a1-5974-55da-8f6a-0c6f7b718518", 00:17:23.684 "is_configured": true, 00:17:23.684 "data_offset": 2048, 00:17:23.684 
"data_size": 63488 00:17:23.684 }, 00:17:23.684 { 00:17:23.684 "name": "BaseBdev3", 00:17:23.684 "uuid": "94dfa98a-ef02-525d-8691-13f18edddeb6", 00:17:23.684 "is_configured": true, 00:17:23.684 "data_offset": 2048, 00:17:23.684 "data_size": 63488 00:17:23.684 } 00:17:23.684 ] 00:17:23.684 }' 00:17:23.684 18:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.684 18:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.248 18:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:24.506 [2024-07-12 18:21:08.038655] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:24.506 [2024-07-12 18:21:08.038693] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:24.506 [2024-07-12 18:21:08.041874] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:24.506 [2024-07-12 18:21:08.041908] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:24.506 [2024-07-12 18:21:08.042009] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:24.506 [2024-07-12 18:21:08.042022] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f7280 name raid_bdev1, state offline 00:17:24.506 0 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2516388 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2516388 ']' 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2516388 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' 
Linux = Linux ']' 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2516388 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2516388' 00:17:24.506 killing process with pid 2516388 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2516388 00:17:24.506 [2024-07-12 18:21:08.115719] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:24.506 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2516388 00:17:24.506 [2024-07-12 18:21:08.139301] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:24.765 18:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:24.765 18:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ovmoTBZrwf 00:17:24.765 18:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:24.765 18:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:24.765 18:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:24.765 18:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:24.765 18:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:24.765 18:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:24.765 00:17:24.765 real 0m6.986s 00:17:24.765 user 0m11.065s 00:17:24.765 sys 0m1.186s 00:17:24.765 18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:24.765 
18:21:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.765 ************************************ 00:17:24.765 END TEST raid_read_error_test 00:17:24.765 ************************************ 00:17:24.765 18:21:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:24.765 18:21:08 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:17:24.765 18:21:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:24.765 18:21:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:24.765 18:21:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:24.765 ************************************ 00:17:24.765 START TEST raid_write_error_test 00:17:24.765 ************************************ 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:24.765 
18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.9V6uvVJFp2 00:17:24.765 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2517793 00:17:25.024 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2517793 /var/tmp/spdk-raid.sock 00:17:25.024 18:21:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:25.024 18:21:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2517793 ']' 00:17:25.024 18:21:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:25.024 18:21:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:25.024 18:21:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:25.024 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:25.024 18:21:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:25.024 18:21:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:25.024 [2024-07-12 18:21:08.546543] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:17:25.024 [2024-07-12 18:21:08.546600] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2517793 ] 00:17:25.024 [2024-07-12 18:21:08.660717] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:25.283 [2024-07-12 18:21:08.768619] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:25.283 [2024-07-12 18:21:08.828774] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:25.283 [2024-07-12 18:21:08.828804] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:25.849 18:21:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:25.849 18:21:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:25.849 18:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:25.849 18:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:26.107 BaseBdev1_malloc 00:17:26.107 18:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:26.365 true 00:17:26.365 18:21:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:26.365 [2024-07-12 18:21:10.069082] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:26.365 [2024-07-12 18:21:10.069133] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:17:26.365 [2024-07-12 18:21:10.069153] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13290d0 00:17:26.365 [2024-07-12 18:21:10.069166] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:26.365 [2024-07-12 18:21:10.070916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:26.365 [2024-07-12 18:21:10.070953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:26.365 BaseBdev1 00:17:26.365 18:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:26.365 18:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:26.624 BaseBdev2_malloc 00:17:26.624 18:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:26.883 true 00:17:26.883 18:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:27.142 [2024-07-12 18:21:10.755847] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:27.142 [2024-07-12 18:21:10.755898] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:27.142 [2024-07-12 18:21:10.755918] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132d910 00:17:27.142 [2024-07-12 18:21:10.755935] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:27.142 [2024-07-12 18:21:10.757509] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:27.142 [2024-07-12 18:21:10.757537] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:27.142 BaseBdev2 00:17:27.142 18:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:27.142 18:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:27.400 BaseBdev3_malloc 00:17:27.400 18:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:27.400 true 00:17:27.658 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:27.658 [2024-07-12 18:21:11.281873] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:27.658 [2024-07-12 18:21:11.281916] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:27.658 [2024-07-12 18:21:11.281938] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132fbd0 00:17:27.658 [2024-07-12 18:21:11.281951] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:27.658 [2024-07-12 18:21:11.283311] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:27.658 [2024-07-12 18:21:11.283336] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:27.658 BaseBdev3 00:17:27.658 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:27.916 [2024-07-12 18:21:11.526547] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:27.916 [2024-07-12 18:21:11.527716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:27.916 [2024-07-12 18:21:11.527783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:27.916 [2024-07-12 18:21:11.527985] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1331280 00:17:27.916 [2024-07-12 18:21:11.527997] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:27.916 [2024-07-12 18:21:11.528181] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1330e20 00:17:27.916 [2024-07-12 18:21:11.528325] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1331280 00:17:27.916 [2024-07-12 18:21:11.528340] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1331280 00:17:27.916 [2024-07-12 18:21:11.528437] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:27.916 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.175 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.175 "name": "raid_bdev1", 00:17:28.175 "uuid": "274b6767-a1f7-400f-a83d-baa84a42effd", 00:17:28.175 "strip_size_kb": 0, 00:17:28.175 "state": "online", 00:17:28.175 "raid_level": "raid1", 00:17:28.175 "superblock": true, 00:17:28.175 "num_base_bdevs": 3, 00:17:28.175 "num_base_bdevs_discovered": 3, 00:17:28.175 "num_base_bdevs_operational": 3, 00:17:28.175 "base_bdevs_list": [ 00:17:28.175 { 00:17:28.175 "name": "BaseBdev1", 00:17:28.175 "uuid": "e3d954c2-0586-5fa6-8dad-b1f49b5bd8be", 00:17:28.175 "is_configured": true, 00:17:28.175 "data_offset": 2048, 00:17:28.175 "data_size": 63488 00:17:28.175 }, 00:17:28.175 { 00:17:28.175 "name": "BaseBdev2", 00:17:28.175 "uuid": "1d6fec0d-289f-5071-afb5-3aecdf842129", 00:17:28.175 "is_configured": true, 00:17:28.175 "data_offset": 2048, 00:17:28.175 "data_size": 63488 00:17:28.175 }, 00:17:28.175 { 00:17:28.175 "name": "BaseBdev3", 00:17:28.175 "uuid": "9800c15c-9ecf-529c-a7bb-06ac354297ce", 00:17:28.175 "is_configured": true, 00:17:28.175 "data_offset": 2048, 00:17:28.175 "data_size": 63488 00:17:28.175 } 00:17:28.175 ] 00:17:28.175 }' 00:17:28.175 18:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.175 18:21:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.741 18:21:12 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:17:28.741 18:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:28.741 [2024-07-12 18:21:12.437252] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x117ee00 00:17:29.676 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:29.936 [2024-07-12 18:21:13.565202] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:29.936 [2024-07-12 18:21:13.565260] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:29.936 [2024-07-12 18:21:13.565451] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x117ee00 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.936 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:30.194 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.194 "name": "raid_bdev1", 00:17:30.194 "uuid": "274b6767-a1f7-400f-a83d-baa84a42effd", 00:17:30.194 "strip_size_kb": 0, 00:17:30.194 "state": "online", 00:17:30.194 "raid_level": "raid1", 00:17:30.194 "superblock": true, 00:17:30.194 "num_base_bdevs": 3, 00:17:30.194 "num_base_bdevs_discovered": 2, 00:17:30.194 "num_base_bdevs_operational": 2, 00:17:30.194 "base_bdevs_list": [ 00:17:30.194 { 00:17:30.194 "name": null, 00:17:30.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:30.194 "is_configured": false, 00:17:30.194 "data_offset": 2048, 00:17:30.194 "data_size": 63488 00:17:30.194 }, 00:17:30.194 { 00:17:30.194 "name": "BaseBdev2", 00:17:30.194 "uuid": "1d6fec0d-289f-5071-afb5-3aecdf842129", 00:17:30.194 "is_configured": true, 00:17:30.194 "data_offset": 2048, 00:17:30.194 "data_size": 63488 00:17:30.194 }, 00:17:30.194 { 00:17:30.194 "name": "BaseBdev3", 00:17:30.194 "uuid": "9800c15c-9ecf-529c-a7bb-06ac354297ce", 00:17:30.194 "is_configured": true, 00:17:30.194 "data_offset": 2048, 
00:17:30.194 "data_size": 63488 00:17:30.194 } 00:17:30.194 ] 00:17:30.194 }' 00:17:30.194 18:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.194 18:21:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.762 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:31.021 [2024-07-12 18:21:14.592067] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:31.021 [2024-07-12 18:21:14.592104] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:31.021 [2024-07-12 18:21:14.595257] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:31.021 [2024-07-12 18:21:14.595288] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:31.021 [2024-07-12 18:21:14.595360] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:31.021 [2024-07-12 18:21:14.595373] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1331280 name raid_bdev1, state offline 00:17:31.021 0 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2517793 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2517793 ']' 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2517793 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2517793 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2517793' 00:17:31.021 killing process with pid 2517793 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2517793 00:17:31.021 [2024-07-12 18:21:14.660750] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:31.021 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2517793 00:17:31.021 [2024-07-12 18:21:14.682118] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.9V6uvVJFp2 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:31.280 00:17:31.280 real 0m6.451s 00:17:31.280 user 0m10.037s 00:17:31.280 sys 0m1.171s 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:31.280 18:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.280 ************************************ 00:17:31.280 END TEST raid_write_error_test 
00:17:31.280 ************************************ 00:17:31.280 18:21:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:31.280 18:21:14 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:17:31.280 18:21:14 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:31.280 18:21:14 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:17:31.280 18:21:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:31.280 18:21:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:31.280 18:21:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:31.280 ************************************ 00:17:31.280 START TEST raid_state_function_test 00:17:31.280 ************************************ 00:17:31.280 18:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:17:31.280 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:31.280 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:31.280 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:31.280 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # 
'[' false = true ']' 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2518775 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2518775' 00:17:31.540 Process raid pid: 2518775 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2518775 /var/tmp/spdk-raid.sock 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2518775 ']' 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:31.540 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:31.540 18:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:31.540 [2024-07-12 18:21:15.072654] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:17:31.540 [2024-07-12 18:21:15.072718] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:31.540 [2024-07-12 18:21:15.200327] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:31.799 [2024-07-12 18:21:15.302846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:31.799 [2024-07-12 18:21:15.366327] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:31.799 [2024-07-12 18:21:15.366358] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:32.369 18:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:32.369 18:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:32.369 18:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:32.629 [2024-07-12 18:21:16.131303] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:32.629 [2024-07-12 18:21:16.131346] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:32.629 [2024-07-12 18:21:16.131357] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:32.629 [2024-07-12 18:21:16.131369] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:32.629 [2024-07-12 18:21:16.131377] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:32.629 [2024-07-12 18:21:16.131388] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:17:32.629 [2024-07-12 18:21:16.131401] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:32.629 [2024-07-12 18:21:16.131412] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.629 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:32.890 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.890 "name": "Existed_Raid", 00:17:32.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.890 "strip_size_kb": 64, 
00:17:32.890 "state": "configuring", 00:17:32.890 "raid_level": "raid0", 00:17:32.890 "superblock": false, 00:17:32.890 "num_base_bdevs": 4, 00:17:32.890 "num_base_bdevs_discovered": 0, 00:17:32.890 "num_base_bdevs_operational": 4, 00:17:32.890 "base_bdevs_list": [ 00:17:32.890 { 00:17:32.890 "name": "BaseBdev1", 00:17:32.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.890 "is_configured": false, 00:17:32.890 "data_offset": 0, 00:17:32.890 "data_size": 0 00:17:32.890 }, 00:17:32.890 { 00:17:32.890 "name": "BaseBdev2", 00:17:32.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.890 "is_configured": false, 00:17:32.890 "data_offset": 0, 00:17:32.890 "data_size": 0 00:17:32.890 }, 00:17:32.890 { 00:17:32.890 "name": "BaseBdev3", 00:17:32.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.890 "is_configured": false, 00:17:32.890 "data_offset": 0, 00:17:32.890 "data_size": 0 00:17:32.890 }, 00:17:32.890 { 00:17:32.890 "name": "BaseBdev4", 00:17:32.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:32.890 "is_configured": false, 00:17:32.890 "data_offset": 0, 00:17:32.890 "data_size": 0 00:17:32.890 } 00:17:32.890 ] 00:17:32.890 }' 00:17:32.890 18:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.890 18:21:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.828 18:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:34.087 [2024-07-12 18:21:17.719386] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:34.087 [2024-07-12 18:21:17.719417] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16eeaa0 name Existed_Raid, state configuring 00:17:34.087 18:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:34.346 [2024-07-12 18:21:17.964053] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:34.346 [2024-07-12 18:21:17.964084] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:34.346 [2024-07-12 18:21:17.964094] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:34.346 [2024-07-12 18:21:17.964105] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:34.346 [2024-07-12 18:21:17.964122] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:34.346 [2024-07-12 18:21:17.964133] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:34.346 [2024-07-12 18:21:17.964142] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:34.346 [2024-07-12 18:21:17.964153] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:34.346 18:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:34.618 [2024-07-12 18:21:18.207771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:34.618 BaseBdev1 00:17:34.618 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:34.618 18:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:34.618 18:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:34.618 18:21:18 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:17:34.618 18:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:34.619 18:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:34.619 18:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:34.949 18:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:35.208 [ 00:17:35.209 { 00:17:35.209 "name": "BaseBdev1", 00:17:35.209 "aliases": [ 00:17:35.209 "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2" 00:17:35.209 ], 00:17:35.209 "product_name": "Malloc disk", 00:17:35.209 "block_size": 512, 00:17:35.209 "num_blocks": 65536, 00:17:35.209 "uuid": "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2", 00:17:35.209 "assigned_rate_limits": { 00:17:35.209 "rw_ios_per_sec": 0, 00:17:35.209 "rw_mbytes_per_sec": 0, 00:17:35.209 "r_mbytes_per_sec": 0, 00:17:35.209 "w_mbytes_per_sec": 0 00:17:35.209 }, 00:17:35.209 "claimed": true, 00:17:35.209 "claim_type": "exclusive_write", 00:17:35.209 "zoned": false, 00:17:35.209 "supported_io_types": { 00:17:35.209 "read": true, 00:17:35.209 "write": true, 00:17:35.209 "unmap": true, 00:17:35.209 "flush": true, 00:17:35.209 "reset": true, 00:17:35.209 "nvme_admin": false, 00:17:35.209 "nvme_io": false, 00:17:35.209 "nvme_io_md": false, 00:17:35.209 "write_zeroes": true, 00:17:35.209 "zcopy": true, 00:17:35.209 "get_zone_info": false, 00:17:35.209 "zone_management": false, 00:17:35.209 "zone_append": false, 00:17:35.209 "compare": false, 00:17:35.209 "compare_and_write": false, 00:17:35.209 "abort": true, 00:17:35.209 "seek_hole": false, 00:17:35.209 "seek_data": false, 00:17:35.209 "copy": true, 00:17:35.209 "nvme_iov_md": false 
00:17:35.209 }, 00:17:35.209 "memory_domains": [ 00:17:35.209 { 00:17:35.209 "dma_device_id": "system", 00:17:35.209 "dma_device_type": 1 00:17:35.209 }, 00:17:35.209 { 00:17:35.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.209 "dma_device_type": 2 00:17:35.209 } 00:17:35.209 ], 00:17:35.209 "driver_specific": {} 00:17:35.209 } 00:17:35.209 ] 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.209 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:35.469 18:21:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.469 "name": "Existed_Raid", 00:17:35.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.469 "strip_size_kb": 64, 00:17:35.469 "state": "configuring", 00:17:35.469 "raid_level": "raid0", 00:17:35.469 "superblock": false, 00:17:35.469 "num_base_bdevs": 4, 00:17:35.469 "num_base_bdevs_discovered": 1, 00:17:35.469 "num_base_bdevs_operational": 4, 00:17:35.469 "base_bdevs_list": [ 00:17:35.469 { 00:17:35.469 "name": "BaseBdev1", 00:17:35.469 "uuid": "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2", 00:17:35.469 "is_configured": true, 00:17:35.469 "data_offset": 0, 00:17:35.469 "data_size": 65536 00:17:35.469 }, 00:17:35.469 { 00:17:35.469 "name": "BaseBdev2", 00:17:35.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.469 "is_configured": false, 00:17:35.469 "data_offset": 0, 00:17:35.469 "data_size": 0 00:17:35.469 }, 00:17:35.469 { 00:17:35.469 "name": "BaseBdev3", 00:17:35.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.469 "is_configured": false, 00:17:35.469 "data_offset": 0, 00:17:35.469 "data_size": 0 00:17:35.469 }, 00:17:35.469 { 00:17:35.469 "name": "BaseBdev4", 00:17:35.469 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:35.469 "is_configured": false, 00:17:35.469 "data_offset": 0, 00:17:35.469 "data_size": 0 00:17:35.469 } 00:17:35.469 ] 00:17:35.469 }' 00:17:35.469 18:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.469 18:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:36.038 18:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:36.038 [2024-07-12 18:21:19.764078] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:36.038 [2024-07-12 18:21:19.764114] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ee310 name Existed_Raid, state configuring 00:17:36.297 18:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:36.297 [2024-07-12 18:21:20.012785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:36.297 [2024-07-12 18:21:20.014230] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:36.297 [2024-07-12 18:21:20.014263] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:36.297 [2024-07-12 18:21:20.014273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:36.297 [2024-07-12 18:21:20.014286] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:36.297 [2024-07-12 18:21:20.014295] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:36.297 [2024-07-12 18:21:20.014306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:36.556 
18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.556 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.815 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.815 "name": "Existed_Raid", 00:17:36.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.815 "strip_size_kb": 64, 00:17:36.815 "state": "configuring", 00:17:36.815 "raid_level": "raid0", 00:17:36.815 "superblock": false, 00:17:36.815 "num_base_bdevs": 4, 00:17:36.815 "num_base_bdevs_discovered": 1, 00:17:36.815 "num_base_bdevs_operational": 4, 00:17:36.815 "base_bdevs_list": [ 00:17:36.815 { 00:17:36.815 "name": "BaseBdev1", 00:17:36.815 "uuid": "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2", 00:17:36.815 "is_configured": true, 00:17:36.815 "data_offset": 0, 00:17:36.815 "data_size": 65536 00:17:36.815 }, 00:17:36.815 { 00:17:36.815 "name": "BaseBdev2", 00:17:36.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.815 "is_configured": false, 00:17:36.815 "data_offset": 0, 00:17:36.815 "data_size": 0 00:17:36.815 }, 00:17:36.815 { 00:17:36.815 "name": "BaseBdev3", 00:17:36.815 
"uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.815 "is_configured": false, 00:17:36.815 "data_offset": 0, 00:17:36.815 "data_size": 0 00:17:36.815 }, 00:17:36.815 { 00:17:36.815 "name": "BaseBdev4", 00:17:36.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:36.815 "is_configured": false, 00:17:36.815 "data_offset": 0, 00:17:36.815 "data_size": 0 00:17:36.815 } 00:17:36.815 ] 00:17:36.815 }' 00:17:36.815 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.815 18:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:37.385 18:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:37.385 [2024-07-12 18:21:21.095007] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:37.385 BaseBdev2 00:17:37.385 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:37.385 18:21:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:37.385 18:21:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:37.646 18:21:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:37.646 18:21:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:37.646 18:21:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:37.646 18:21:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:37.646 18:21:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:38.216 [ 00:17:38.216 { 00:17:38.216 "name": "BaseBdev2", 00:17:38.216 "aliases": [ 00:17:38.216 "f90dac70-81a8-43c0-a9fb-f34c79d97f01" 00:17:38.216 ], 00:17:38.216 "product_name": "Malloc disk", 00:17:38.216 "block_size": 512, 00:17:38.216 "num_blocks": 65536, 00:17:38.216 "uuid": "f90dac70-81a8-43c0-a9fb-f34c79d97f01", 00:17:38.216 "assigned_rate_limits": { 00:17:38.216 "rw_ios_per_sec": 0, 00:17:38.216 "rw_mbytes_per_sec": 0, 00:17:38.216 "r_mbytes_per_sec": 0, 00:17:38.216 "w_mbytes_per_sec": 0 00:17:38.216 }, 00:17:38.216 "claimed": true, 00:17:38.216 "claim_type": "exclusive_write", 00:17:38.216 "zoned": false, 00:17:38.216 "supported_io_types": { 00:17:38.216 "read": true, 00:17:38.216 "write": true, 00:17:38.216 "unmap": true, 00:17:38.216 "flush": true, 00:17:38.216 "reset": true, 00:17:38.216 "nvme_admin": false, 00:17:38.216 "nvme_io": false, 00:17:38.216 "nvme_io_md": false, 00:17:38.216 "write_zeroes": true, 00:17:38.216 "zcopy": true, 00:17:38.216 "get_zone_info": false, 00:17:38.216 "zone_management": false, 00:17:38.216 "zone_append": false, 00:17:38.216 "compare": false, 00:17:38.216 "compare_and_write": false, 00:17:38.216 "abort": true, 00:17:38.216 "seek_hole": false, 00:17:38.216 "seek_data": false, 00:17:38.216 "copy": true, 00:17:38.216 "nvme_iov_md": false 00:17:38.216 }, 00:17:38.216 "memory_domains": [ 00:17:38.216 { 00:17:38.216 "dma_device_id": "system", 00:17:38.216 "dma_device_type": 1 00:17:38.216 }, 00:17:38.216 { 00:17:38.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:38.216 "dma_device_type": 2 00:17:38.216 } 00:17:38.216 ], 00:17:38.216 "driver_specific": {} 00:17:38.216 } 00:17:38.216 ] 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.216 18:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.476 18:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.476 "name": "Existed_Raid", 00:17:38.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.476 "strip_size_kb": 64, 00:17:38.476 "state": "configuring", 00:17:38.476 "raid_level": "raid0", 00:17:38.476 "superblock": false, 00:17:38.476 "num_base_bdevs": 4, 00:17:38.476 
"num_base_bdevs_discovered": 2, 00:17:38.476 "num_base_bdevs_operational": 4, 00:17:38.476 "base_bdevs_list": [ 00:17:38.476 { 00:17:38.476 "name": "BaseBdev1", 00:17:38.476 "uuid": "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2", 00:17:38.476 "is_configured": true, 00:17:38.476 "data_offset": 0, 00:17:38.476 "data_size": 65536 00:17:38.476 }, 00:17:38.476 { 00:17:38.476 "name": "BaseBdev2", 00:17:38.476 "uuid": "f90dac70-81a8-43c0-a9fb-f34c79d97f01", 00:17:38.476 "is_configured": true, 00:17:38.476 "data_offset": 0, 00:17:38.476 "data_size": 65536 00:17:38.476 }, 00:17:38.476 { 00:17:38.476 "name": "BaseBdev3", 00:17:38.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.476 "is_configured": false, 00:17:38.476 "data_offset": 0, 00:17:38.476 "data_size": 0 00:17:38.476 }, 00:17:38.476 { 00:17:38.476 "name": "BaseBdev4", 00:17:38.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.476 "is_configured": false, 00:17:38.476 "data_offset": 0, 00:17:38.476 "data_size": 0 00:17:38.476 } 00:17:38.476 ] 00:17:38.476 }' 00:17:38.476 18:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.476 18:21:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.045 18:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:39.304 [2024-07-12 18:21:22.863025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:39.304 BaseBdev3 00:17:39.304 18:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:39.304 18:21:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:39.304 18:21:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:39.304 18:21:22 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:39.304 18:21:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:39.304 18:21:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:39.304 18:21:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:39.563 [ 00:17:39.563 { 00:17:39.563 "name": "BaseBdev3", 00:17:39.563 "aliases": [ 00:17:39.563 "18c5c2f7-7687-4bc2-a47b-f2d0f9d8c109" 00:17:39.563 ], 00:17:39.563 "product_name": "Malloc disk", 00:17:39.563 "block_size": 512, 00:17:39.563 "num_blocks": 65536, 00:17:39.563 "uuid": "18c5c2f7-7687-4bc2-a47b-f2d0f9d8c109", 00:17:39.563 "assigned_rate_limits": { 00:17:39.563 "rw_ios_per_sec": 0, 00:17:39.563 "rw_mbytes_per_sec": 0, 00:17:39.563 "r_mbytes_per_sec": 0, 00:17:39.563 "w_mbytes_per_sec": 0 00:17:39.563 }, 00:17:39.563 "claimed": true, 00:17:39.563 "claim_type": "exclusive_write", 00:17:39.563 "zoned": false, 00:17:39.563 "supported_io_types": { 00:17:39.563 "read": true, 00:17:39.563 "write": true, 00:17:39.563 "unmap": true, 00:17:39.563 "flush": true, 00:17:39.563 "reset": true, 00:17:39.563 "nvme_admin": false, 00:17:39.563 "nvme_io": false, 00:17:39.563 "nvme_io_md": false, 00:17:39.563 "write_zeroes": true, 00:17:39.563 "zcopy": true, 00:17:39.563 "get_zone_info": false, 00:17:39.563 "zone_management": false, 00:17:39.563 "zone_append": false, 00:17:39.563 "compare": false, 00:17:39.563 "compare_and_write": false, 00:17:39.563 "abort": true, 00:17:39.563 "seek_hole": false, 00:17:39.563 "seek_data": false, 00:17:39.563 "copy": 
true, 00:17:39.563 "nvme_iov_md": false 00:17:39.563 }, 00:17:39.563 "memory_domains": [ 00:17:39.563 { 00:17:39.563 "dma_device_id": "system", 00:17:39.563 "dma_device_type": 1 00:17:39.563 }, 00:17:39.563 { 00:17:39.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.563 "dma_device_type": 2 00:17:39.563 } 00:17:39.563 ], 00:17:39.563 "driver_specific": {} 00:17:39.563 } 00:17:39.563 ] 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:17:39.563 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.822 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.822 "name": "Existed_Raid", 00:17:39.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.822 "strip_size_kb": 64, 00:17:39.822 "state": "configuring", 00:17:39.822 "raid_level": "raid0", 00:17:39.822 "superblock": false, 00:17:39.822 "num_base_bdevs": 4, 00:17:39.822 "num_base_bdevs_discovered": 3, 00:17:39.822 "num_base_bdevs_operational": 4, 00:17:39.822 "base_bdevs_list": [ 00:17:39.822 { 00:17:39.822 "name": "BaseBdev1", 00:17:39.822 "uuid": "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2", 00:17:39.822 "is_configured": true, 00:17:39.822 "data_offset": 0, 00:17:39.822 "data_size": 65536 00:17:39.822 }, 00:17:39.822 { 00:17:39.822 "name": "BaseBdev2", 00:17:39.822 "uuid": "f90dac70-81a8-43c0-a9fb-f34c79d97f01", 00:17:39.822 "is_configured": true, 00:17:39.822 "data_offset": 0, 00:17:39.822 "data_size": 65536 00:17:39.822 }, 00:17:39.822 { 00:17:39.822 "name": "BaseBdev3", 00:17:39.822 "uuid": "18c5c2f7-7687-4bc2-a47b-f2d0f9d8c109", 00:17:39.822 "is_configured": true, 00:17:39.822 "data_offset": 0, 00:17:39.822 "data_size": 65536 00:17:39.822 }, 00:17:39.822 { 00:17:39.822 "name": "BaseBdev4", 00:17:39.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:39.822 "is_configured": false, 00:17:39.822 "data_offset": 0, 00:17:39.822 "data_size": 0 00:17:39.822 } 00:17:39.822 ] 00:17:39.822 }' 00:17:39.823 18:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.823 18:21:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.389 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:40.648 [2024-07-12 18:21:24.254097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:40.648 [2024-07-12 18:21:24.254134] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16ef350 00:17:40.648 [2024-07-12 18:21:24.254143] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:40.648 [2024-07-12 18:21:24.254396] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16ef020 00:17:40.648 [2024-07-12 18:21:24.254518] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16ef350 00:17:40.648 [2024-07-12 18:21:24.254528] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16ef350 00:17:40.648 [2024-07-12 18:21:24.254689] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:40.648 BaseBdev4 00:17:40.648 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:40.648 18:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:40.648 18:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:40.648 18:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:40.648 18:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:40.648 18:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:40.648 18:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:40.907 18:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:41.166 [ 00:17:41.166 { 00:17:41.166 "name": "BaseBdev4", 00:17:41.166 "aliases": [ 00:17:41.166 "583ce727-b7cc-4ba5-a5f5-4390aa86a64a" 00:17:41.166 ], 00:17:41.166 "product_name": "Malloc disk", 00:17:41.166 "block_size": 512, 00:17:41.166 "num_blocks": 65536, 00:17:41.166 "uuid": "583ce727-b7cc-4ba5-a5f5-4390aa86a64a", 00:17:41.166 "assigned_rate_limits": { 00:17:41.166 "rw_ios_per_sec": 0, 00:17:41.166 "rw_mbytes_per_sec": 0, 00:17:41.166 "r_mbytes_per_sec": 0, 00:17:41.166 "w_mbytes_per_sec": 0 00:17:41.166 }, 00:17:41.166 "claimed": true, 00:17:41.166 "claim_type": "exclusive_write", 00:17:41.166 "zoned": false, 00:17:41.166 "supported_io_types": { 00:17:41.166 "read": true, 00:17:41.166 "write": true, 00:17:41.166 "unmap": true, 00:17:41.166 "flush": true, 00:17:41.166 "reset": true, 00:17:41.166 "nvme_admin": false, 00:17:41.166 "nvme_io": false, 00:17:41.166 "nvme_io_md": false, 00:17:41.166 "write_zeroes": true, 00:17:41.166 "zcopy": true, 00:17:41.166 "get_zone_info": false, 00:17:41.166 "zone_management": false, 00:17:41.166 "zone_append": false, 00:17:41.166 "compare": false, 00:17:41.166 "compare_and_write": false, 00:17:41.166 "abort": true, 00:17:41.166 "seek_hole": false, 00:17:41.166 "seek_data": false, 00:17:41.166 "copy": true, 00:17:41.166 "nvme_iov_md": false 00:17:41.166 }, 00:17:41.166 "memory_domains": [ 00:17:41.166 { 00:17:41.166 "dma_device_id": "system", 00:17:41.166 "dma_device_type": 1 00:17:41.166 }, 00:17:41.166 { 00:17:41.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.166 "dma_device_type": 2 00:17:41.166 } 00:17:41.166 ], 00:17:41.166 "driver_specific": {} 00:17:41.166 } 00:17:41.166 ] 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.166 18:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.426 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.426 "name": "Existed_Raid", 00:17:41.426 "uuid": "a3494d13-6145-4829-8dc9-6e8e8a055499", 00:17:41.426 "strip_size_kb": 64, 00:17:41.426 "state": "online", 00:17:41.426 "raid_level": "raid0", 00:17:41.426 "superblock": false, 00:17:41.426 "num_base_bdevs": 4, 00:17:41.426 
"num_base_bdevs_discovered": 4, 00:17:41.426 "num_base_bdevs_operational": 4, 00:17:41.426 "base_bdevs_list": [ 00:17:41.426 { 00:17:41.426 "name": "BaseBdev1", 00:17:41.426 "uuid": "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2", 00:17:41.426 "is_configured": true, 00:17:41.426 "data_offset": 0, 00:17:41.426 "data_size": 65536 00:17:41.426 }, 00:17:41.426 { 00:17:41.426 "name": "BaseBdev2", 00:17:41.426 "uuid": "f90dac70-81a8-43c0-a9fb-f34c79d97f01", 00:17:41.426 "is_configured": true, 00:17:41.426 "data_offset": 0, 00:17:41.426 "data_size": 65536 00:17:41.426 }, 00:17:41.426 { 00:17:41.426 "name": "BaseBdev3", 00:17:41.426 "uuid": "18c5c2f7-7687-4bc2-a47b-f2d0f9d8c109", 00:17:41.426 "is_configured": true, 00:17:41.426 "data_offset": 0, 00:17:41.426 "data_size": 65536 00:17:41.426 }, 00:17:41.426 { 00:17:41.426 "name": "BaseBdev4", 00:17:41.426 "uuid": "583ce727-b7cc-4ba5-a5f5-4390aa86a64a", 00:17:41.426 "is_configured": true, 00:17:41.426 "data_offset": 0, 00:17:41.426 "data_size": 65536 00:17:41.426 } 00:17:41.426 ] 00:17:41.426 }' 00:17:41.426 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.426 18:21:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.993 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:41.993 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:41.993 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:41.993 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:41.993 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:41.993 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:41.993 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 
-- # jq '.[]' 00:17:41.993 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:42.251 [2024-07-12 18:21:25.770478] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:42.251 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:42.251 "name": "Existed_Raid", 00:17:42.251 "aliases": [ 00:17:42.251 "a3494d13-6145-4829-8dc9-6e8e8a055499" 00:17:42.251 ], 00:17:42.251 "product_name": "Raid Volume", 00:17:42.251 "block_size": 512, 00:17:42.251 "num_blocks": 262144, 00:17:42.251 "uuid": "a3494d13-6145-4829-8dc9-6e8e8a055499", 00:17:42.251 "assigned_rate_limits": { 00:17:42.251 "rw_ios_per_sec": 0, 00:17:42.251 "rw_mbytes_per_sec": 0, 00:17:42.251 "r_mbytes_per_sec": 0, 00:17:42.251 "w_mbytes_per_sec": 0 00:17:42.251 }, 00:17:42.251 "claimed": false, 00:17:42.251 "zoned": false, 00:17:42.251 "supported_io_types": { 00:17:42.251 "read": true, 00:17:42.251 "write": true, 00:17:42.251 "unmap": true, 00:17:42.251 "flush": true, 00:17:42.251 "reset": true, 00:17:42.251 "nvme_admin": false, 00:17:42.251 "nvme_io": false, 00:17:42.251 "nvme_io_md": false, 00:17:42.251 "write_zeroes": true, 00:17:42.251 "zcopy": false, 00:17:42.251 "get_zone_info": false, 00:17:42.251 "zone_management": false, 00:17:42.252 "zone_append": false, 00:17:42.252 "compare": false, 00:17:42.252 "compare_and_write": false, 00:17:42.252 "abort": false, 00:17:42.252 "seek_hole": false, 00:17:42.252 "seek_data": false, 00:17:42.252 "copy": false, 00:17:42.252 "nvme_iov_md": false 00:17:42.252 }, 00:17:42.252 "memory_domains": [ 00:17:42.252 { 00:17:42.252 "dma_device_id": "system", 00:17:42.252 "dma_device_type": 1 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.252 "dma_device_type": 2 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 
"dma_device_id": "system", 00:17:42.252 "dma_device_type": 1 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.252 "dma_device_type": 2 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 "dma_device_id": "system", 00:17:42.252 "dma_device_type": 1 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.252 "dma_device_type": 2 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 "dma_device_id": "system", 00:17:42.252 "dma_device_type": 1 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.252 "dma_device_type": 2 00:17:42.252 } 00:17:42.252 ], 00:17:42.252 "driver_specific": { 00:17:42.252 "raid": { 00:17:42.252 "uuid": "a3494d13-6145-4829-8dc9-6e8e8a055499", 00:17:42.252 "strip_size_kb": 64, 00:17:42.252 "state": "online", 00:17:42.252 "raid_level": "raid0", 00:17:42.252 "superblock": false, 00:17:42.252 "num_base_bdevs": 4, 00:17:42.252 "num_base_bdevs_discovered": 4, 00:17:42.252 "num_base_bdevs_operational": 4, 00:17:42.252 "base_bdevs_list": [ 00:17:42.252 { 00:17:42.252 "name": "BaseBdev1", 00:17:42.252 "uuid": "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2", 00:17:42.252 "is_configured": true, 00:17:42.252 "data_offset": 0, 00:17:42.252 "data_size": 65536 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 "name": "BaseBdev2", 00:17:42.252 "uuid": "f90dac70-81a8-43c0-a9fb-f34c79d97f01", 00:17:42.252 "is_configured": true, 00:17:42.252 "data_offset": 0, 00:17:42.252 "data_size": 65536 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 "name": "BaseBdev3", 00:17:42.252 "uuid": "18c5c2f7-7687-4bc2-a47b-f2d0f9d8c109", 00:17:42.252 "is_configured": true, 00:17:42.252 "data_offset": 0, 00:17:42.252 "data_size": 65536 00:17:42.252 }, 00:17:42.252 { 00:17:42.252 "name": "BaseBdev4", 00:17:42.252 "uuid": "583ce727-b7cc-4ba5-a5f5-4390aa86a64a", 00:17:42.252 "is_configured": true, 00:17:42.252 "data_offset": 0, 00:17:42.252 "data_size": 65536 00:17:42.252 } 00:17:42.252 ] 
00:17:42.252 } 00:17:42.252 } 00:17:42.252 }' 00:17:42.252 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:42.252 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:42.252 BaseBdev2 00:17:42.252 BaseBdev3 00:17:42.252 BaseBdev4' 00:17:42.252 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.252 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:42.252 18:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.511 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:42.511 "name": "BaseBdev1", 00:17:42.511 "aliases": [ 00:17:42.511 "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2" 00:17:42.511 ], 00:17:42.511 "product_name": "Malloc disk", 00:17:42.511 "block_size": 512, 00:17:42.511 "num_blocks": 65536, 00:17:42.511 "uuid": "4e3aa81c-0caa-4b99-8c07-b6a887eb70c2", 00:17:42.511 "assigned_rate_limits": { 00:17:42.511 "rw_ios_per_sec": 0, 00:17:42.511 "rw_mbytes_per_sec": 0, 00:17:42.511 "r_mbytes_per_sec": 0, 00:17:42.511 "w_mbytes_per_sec": 0 00:17:42.511 }, 00:17:42.511 "claimed": true, 00:17:42.511 "claim_type": "exclusive_write", 00:17:42.511 "zoned": false, 00:17:42.511 "supported_io_types": { 00:17:42.511 "read": true, 00:17:42.511 "write": true, 00:17:42.511 "unmap": true, 00:17:42.511 "flush": true, 00:17:42.511 "reset": true, 00:17:42.511 "nvme_admin": false, 00:17:42.511 "nvme_io": false, 00:17:42.511 "nvme_io_md": false, 00:17:42.511 "write_zeroes": true, 00:17:42.511 "zcopy": true, 00:17:42.511 "get_zone_info": false, 00:17:42.511 "zone_management": false, 00:17:42.511 "zone_append": false, 00:17:42.511 "compare": 
false, 00:17:42.511 "compare_and_write": false, 00:17:42.511 "abort": true, 00:17:42.511 "seek_hole": false, 00:17:42.511 "seek_data": false, 00:17:42.511 "copy": true, 00:17:42.511 "nvme_iov_md": false 00:17:42.511 }, 00:17:42.511 "memory_domains": [ 00:17:42.511 { 00:17:42.511 "dma_device_id": "system", 00:17:42.511 "dma_device_type": 1 00:17:42.511 }, 00:17:42.511 { 00:17:42.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.511 "dma_device_type": 2 00:17:42.511 } 00:17:42.511 ], 00:17:42.511 "driver_specific": {} 00:17:42.511 }' 00:17:42.511 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.511 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.511 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:42.511 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.511 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:42.770 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:43.029 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.029 "name": "BaseBdev2", 00:17:43.029 "aliases": [ 00:17:43.029 "f90dac70-81a8-43c0-a9fb-f34c79d97f01" 00:17:43.029 ], 00:17:43.029 "product_name": "Malloc disk", 00:17:43.029 "block_size": 512, 00:17:43.029 "num_blocks": 65536, 00:17:43.029 "uuid": "f90dac70-81a8-43c0-a9fb-f34c79d97f01", 00:17:43.029 "assigned_rate_limits": { 00:17:43.029 "rw_ios_per_sec": 0, 00:17:43.029 "rw_mbytes_per_sec": 0, 00:17:43.029 "r_mbytes_per_sec": 0, 00:17:43.029 "w_mbytes_per_sec": 0 00:17:43.029 }, 00:17:43.029 "claimed": true, 00:17:43.029 "claim_type": "exclusive_write", 00:17:43.029 "zoned": false, 00:17:43.029 "supported_io_types": { 00:17:43.029 "read": true, 00:17:43.029 "write": true, 00:17:43.029 "unmap": true, 00:17:43.029 "flush": true, 00:17:43.029 "reset": true, 00:17:43.029 "nvme_admin": false, 00:17:43.029 "nvme_io": false, 00:17:43.029 "nvme_io_md": false, 00:17:43.029 "write_zeroes": true, 00:17:43.029 "zcopy": true, 00:17:43.029 "get_zone_info": false, 00:17:43.029 "zone_management": false, 00:17:43.029 "zone_append": false, 00:17:43.029 "compare": false, 00:17:43.029 "compare_and_write": false, 00:17:43.029 "abort": true, 00:17:43.029 "seek_hole": false, 00:17:43.029 "seek_data": false, 00:17:43.029 "copy": true, 00:17:43.029 "nvme_iov_md": false 00:17:43.029 }, 00:17:43.029 "memory_domains": [ 00:17:43.029 { 00:17:43.029 "dma_device_id": "system", 00:17:43.029 "dma_device_type": 1 00:17:43.029 }, 00:17:43.029 { 00:17:43.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.029 "dma_device_type": 2 00:17:43.029 } 00:17:43.029 ], 00:17:43.029 "driver_specific": {} 00:17:43.029 }' 00:17:43.029 18:21:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.029 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.288 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:43.288 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.288 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.288 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.288 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.288 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.288 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.288 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.288 18:21:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.288 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:43.288 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:43.547 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:43.547 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:43.547 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:43.547 "name": "BaseBdev3", 00:17:43.547 "aliases": [ 00:17:43.547 "18c5c2f7-7687-4bc2-a47b-f2d0f9d8c109" 00:17:43.547 ], 00:17:43.547 "product_name": "Malloc disk", 00:17:43.547 "block_size": 512, 00:17:43.547 "num_blocks": 65536, 00:17:43.547 "uuid": "18c5c2f7-7687-4bc2-a47b-f2d0f9d8c109", 
00:17:43.547 "assigned_rate_limits": { 00:17:43.547 "rw_ios_per_sec": 0, 00:17:43.547 "rw_mbytes_per_sec": 0, 00:17:43.547 "r_mbytes_per_sec": 0, 00:17:43.547 "w_mbytes_per_sec": 0 00:17:43.547 }, 00:17:43.547 "claimed": true, 00:17:43.547 "claim_type": "exclusive_write", 00:17:43.547 "zoned": false, 00:17:43.547 "supported_io_types": { 00:17:43.547 "read": true, 00:17:43.547 "write": true, 00:17:43.547 "unmap": true, 00:17:43.547 "flush": true, 00:17:43.547 "reset": true, 00:17:43.547 "nvme_admin": false, 00:17:43.547 "nvme_io": false, 00:17:43.547 "nvme_io_md": false, 00:17:43.547 "write_zeroes": true, 00:17:43.547 "zcopy": true, 00:17:43.547 "get_zone_info": false, 00:17:43.547 "zone_management": false, 00:17:43.547 "zone_append": false, 00:17:43.547 "compare": false, 00:17:43.547 "compare_and_write": false, 00:17:43.547 "abort": true, 00:17:43.547 "seek_hole": false, 00:17:43.547 "seek_data": false, 00:17:43.547 "copy": true, 00:17:43.547 "nvme_iov_md": false 00:17:43.547 }, 00:17:43.547 "memory_domains": [ 00:17:43.547 { 00:17:43.547 "dma_device_id": "system", 00:17:43.547 "dma_device_type": 1 00:17:43.547 }, 00:17:43.547 { 00:17:43.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.547 "dma_device_type": 2 00:17:43.547 } 00:17:43.547 ], 00:17:43.547 "driver_specific": {} 00:17:43.547 }' 00:17:43.547 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.805 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.805 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:43.805 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.805 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.805 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.805 18:21:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.805 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.805 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.805 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.064 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.064 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:44.064 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:44.064 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:44.064 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:44.323 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:44.323 "name": "BaseBdev4", 00:17:44.323 "aliases": [ 00:17:44.323 "583ce727-b7cc-4ba5-a5f5-4390aa86a64a" 00:17:44.323 ], 00:17:44.323 "product_name": "Malloc disk", 00:17:44.323 "block_size": 512, 00:17:44.323 "num_blocks": 65536, 00:17:44.323 "uuid": "583ce727-b7cc-4ba5-a5f5-4390aa86a64a", 00:17:44.323 "assigned_rate_limits": { 00:17:44.323 "rw_ios_per_sec": 0, 00:17:44.323 "rw_mbytes_per_sec": 0, 00:17:44.323 "r_mbytes_per_sec": 0, 00:17:44.323 "w_mbytes_per_sec": 0 00:17:44.323 }, 00:17:44.323 "claimed": true, 00:17:44.323 "claim_type": "exclusive_write", 00:17:44.323 "zoned": false, 00:17:44.323 "supported_io_types": { 00:17:44.323 "read": true, 00:17:44.323 "write": true, 00:17:44.323 "unmap": true, 00:17:44.323 "flush": true, 00:17:44.323 "reset": true, 00:17:44.323 "nvme_admin": false, 00:17:44.323 "nvme_io": false, 00:17:44.323 "nvme_io_md": false, 00:17:44.323 "write_zeroes": true, 
00:17:44.323 "zcopy": true, 00:17:44.323 "get_zone_info": false, 00:17:44.323 "zone_management": false, 00:17:44.323 "zone_append": false, 00:17:44.323 "compare": false, 00:17:44.323 "compare_and_write": false, 00:17:44.323 "abort": true, 00:17:44.323 "seek_hole": false, 00:17:44.323 "seek_data": false, 00:17:44.323 "copy": true, 00:17:44.323 "nvme_iov_md": false 00:17:44.323 }, 00:17:44.323 "memory_domains": [ 00:17:44.323 { 00:17:44.323 "dma_device_id": "system", 00:17:44.323 "dma_device_type": 1 00:17:44.323 }, 00:17:44.323 { 00:17:44.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.323 "dma_device_type": 2 00:17:44.323 } 00:17:44.323 ], 00:17:44.323 "driver_specific": {} 00:17:44.323 }' 00:17:44.323 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.323 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:44.323 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:44.323 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.323 18:21:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:44.323 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:44.323 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.581 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:44.581 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:44.581 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.581 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:44.581 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:44.581 18:21:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:44.839 [2024-07-12 18:21:28.409207] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:44.839 [2024-07-12 18:21:28.409236] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:44.839 [2024-07-12 18:21:28.409283] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.839 18:21:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.839 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:45.098 18:21:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.098 "name": "Existed_Raid", 00:17:45.098 "uuid": "a3494d13-6145-4829-8dc9-6e8e8a055499", 00:17:45.098 "strip_size_kb": 64, 00:17:45.098 "state": "offline", 00:17:45.098 "raid_level": "raid0", 00:17:45.098 "superblock": false, 00:17:45.098 "num_base_bdevs": 4, 00:17:45.098 "num_base_bdevs_discovered": 3, 00:17:45.098 "num_base_bdevs_operational": 3, 00:17:45.098 "base_bdevs_list": [ 00:17:45.098 { 00:17:45.098 "name": null, 00:17:45.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:45.098 "is_configured": false, 00:17:45.098 "data_offset": 0, 00:17:45.098 "data_size": 65536 00:17:45.098 }, 00:17:45.098 { 00:17:45.098 "name": "BaseBdev2", 00:17:45.098 "uuid": "f90dac70-81a8-43c0-a9fb-f34c79d97f01", 00:17:45.098 "is_configured": true, 00:17:45.098 "data_offset": 0, 00:17:45.098 "data_size": 65536 00:17:45.098 }, 00:17:45.098 { 00:17:45.098 "name": "BaseBdev3", 00:17:45.098 "uuid": "18c5c2f7-7687-4bc2-a47b-f2d0f9d8c109", 00:17:45.098 "is_configured": true, 00:17:45.098 "data_offset": 0, 00:17:45.098 "data_size": 65536 00:17:45.098 }, 00:17:45.098 { 00:17:45.098 "name": "BaseBdev4", 00:17:45.098 "uuid": "583ce727-b7cc-4ba5-a5f5-4390aa86a64a", 00:17:45.098 "is_configured": true, 00:17:45.098 "data_offset": 0, 00:17:45.098 "data_size": 65536 00:17:45.098 } 00:17:45.098 ] 00:17:45.098 }' 00:17:45.098 18:21:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.098 18:21:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.664 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:45.664 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:45.664 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.664 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:45.922 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:45.922 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:45.922 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:46.181 [2024-07-12 18:21:29.734654] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:46.181 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:46.181 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:46.181 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.181 18:21:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:46.440 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:46.440 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid 
']' 00:17:46.440 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:46.699 [2024-07-12 18:21:30.230582] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:46.699 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:46.699 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:46.699 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.699 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:46.958 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:46.958 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:46.958 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:47.217 [2024-07-12 18:21:30.732310] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:47.217 [2024-07-12 18:21:30.732356] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ef350 name Existed_Raid, state offline 00:17:47.217 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:47.217 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:47.217 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.217 18:21:30 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:47.217 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:47.217 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:47.217 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:47.217 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:47.217 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:47.218 18:21:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:47.477 BaseBdev2 00:17:47.477 18:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:47.477 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:47.477 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:47.477 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:47.477 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:47.477 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:47.477 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:47.736 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:47.995 [ 00:17:47.995 { 00:17:47.995 "name": "BaseBdev2", 00:17:47.995 "aliases": 
[ 00:17:47.995 "93d83aca-de01-4230-a859-c3f6452cb770" 00:17:47.995 ], 00:17:47.995 "product_name": "Malloc disk", 00:17:47.995 "block_size": 512, 00:17:47.995 "num_blocks": 65536, 00:17:47.995 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:17:47.995 "assigned_rate_limits": { 00:17:47.995 "rw_ios_per_sec": 0, 00:17:47.995 "rw_mbytes_per_sec": 0, 00:17:47.995 "r_mbytes_per_sec": 0, 00:17:47.995 "w_mbytes_per_sec": 0 00:17:47.995 }, 00:17:47.995 "claimed": false, 00:17:47.995 "zoned": false, 00:17:47.995 "supported_io_types": { 00:17:47.995 "read": true, 00:17:47.995 "write": true, 00:17:47.995 "unmap": true, 00:17:47.995 "flush": true, 00:17:47.995 "reset": true, 00:17:47.995 "nvme_admin": false, 00:17:47.995 "nvme_io": false, 00:17:47.995 "nvme_io_md": false, 00:17:47.995 "write_zeroes": true, 00:17:47.995 "zcopy": true, 00:17:47.995 "get_zone_info": false, 00:17:47.995 "zone_management": false, 00:17:47.995 "zone_append": false, 00:17:47.995 "compare": false, 00:17:47.995 "compare_and_write": false, 00:17:47.995 "abort": true, 00:17:47.995 "seek_hole": false, 00:17:47.995 "seek_data": false, 00:17:47.995 "copy": true, 00:17:47.995 "nvme_iov_md": false 00:17:47.995 }, 00:17:47.995 "memory_domains": [ 00:17:47.995 { 00:17:47.995 "dma_device_id": "system", 00:17:47.995 "dma_device_type": 1 00:17:47.995 }, 00:17:47.995 { 00:17:47.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.995 "dma_device_type": 2 00:17:47.995 } 00:17:47.995 ], 00:17:47.995 "driver_specific": {} 00:17:47.995 } 00:17:47.995 ] 00:17:47.995 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:47.995 18:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:47.995 18:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:47.995 18:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:48.254 BaseBdev3 00:17:48.254 18:21:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:48.254 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:48.254 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:48.254 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:48.254 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:48.254 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:48.254 18:21:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:48.513 18:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:48.771 [ 00:17:48.771 { 00:17:48.771 "name": "BaseBdev3", 00:17:48.771 "aliases": [ 00:17:48.771 "33dcb066-8bcb-4e6e-8a76-8c248e772d63" 00:17:48.771 ], 00:17:48.771 "product_name": "Malloc disk", 00:17:48.771 "block_size": 512, 00:17:48.771 "num_blocks": 65536, 00:17:48.771 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:17:48.771 "assigned_rate_limits": { 00:17:48.771 "rw_ios_per_sec": 0, 00:17:48.771 "rw_mbytes_per_sec": 0, 00:17:48.771 "r_mbytes_per_sec": 0, 00:17:48.771 "w_mbytes_per_sec": 0 00:17:48.771 }, 00:17:48.771 "claimed": false, 00:17:48.771 "zoned": false, 00:17:48.771 "supported_io_types": { 00:17:48.771 "read": true, 00:17:48.771 "write": true, 00:17:48.771 "unmap": true, 00:17:48.771 "flush": true, 00:17:48.771 "reset": true, 00:17:48.771 "nvme_admin": false, 00:17:48.771 
"nvme_io": false, 00:17:48.771 "nvme_io_md": false, 00:17:48.771 "write_zeroes": true, 00:17:48.771 "zcopy": true, 00:17:48.771 "get_zone_info": false, 00:17:48.771 "zone_management": false, 00:17:48.771 "zone_append": false, 00:17:48.771 "compare": false, 00:17:48.771 "compare_and_write": false, 00:17:48.771 "abort": true, 00:17:48.771 "seek_hole": false, 00:17:48.771 "seek_data": false, 00:17:48.771 "copy": true, 00:17:48.771 "nvme_iov_md": false 00:17:48.771 }, 00:17:48.771 "memory_domains": [ 00:17:48.771 { 00:17:48.771 "dma_device_id": "system", 00:17:48.771 "dma_device_type": 1 00:17:48.771 }, 00:17:48.771 { 00:17:48.771 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.771 "dma_device_type": 2 00:17:48.771 } 00:17:48.771 ], 00:17:48.771 "driver_specific": {} 00:17:48.771 } 00:17:48.771 ] 00:17:48.771 18:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:48.771 18:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:48.771 18:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:48.771 18:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:49.030 BaseBdev4 00:17:49.030 18:21:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:49.030 18:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:17:49.030 18:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:49.030 18:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:49.030 18:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:49.030 18:21:32 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:49.030 18:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:49.288 18:21:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:49.546 [ 00:17:49.546 { 00:17:49.546 "name": "BaseBdev4", 00:17:49.546 "aliases": [ 00:17:49.546 "32a95680-7a5f-433f-a7d1-3c15ac2c87f8" 00:17:49.546 ], 00:17:49.546 "product_name": "Malloc disk", 00:17:49.546 "block_size": 512, 00:17:49.546 "num_blocks": 65536, 00:17:49.546 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:17:49.546 "assigned_rate_limits": { 00:17:49.546 "rw_ios_per_sec": 0, 00:17:49.546 "rw_mbytes_per_sec": 0, 00:17:49.546 "r_mbytes_per_sec": 0, 00:17:49.546 "w_mbytes_per_sec": 0 00:17:49.546 }, 00:17:49.546 "claimed": false, 00:17:49.546 "zoned": false, 00:17:49.546 "supported_io_types": { 00:17:49.546 "read": true, 00:17:49.546 "write": true, 00:17:49.546 "unmap": true, 00:17:49.546 "flush": true, 00:17:49.546 "reset": true, 00:17:49.546 "nvme_admin": false, 00:17:49.546 "nvme_io": false, 00:17:49.546 "nvme_io_md": false, 00:17:49.546 "write_zeroes": true, 00:17:49.546 "zcopy": true, 00:17:49.546 "get_zone_info": false, 00:17:49.546 "zone_management": false, 00:17:49.546 "zone_append": false, 00:17:49.546 "compare": false, 00:17:49.546 "compare_and_write": false, 00:17:49.546 "abort": true, 00:17:49.546 "seek_hole": false, 00:17:49.546 "seek_data": false, 00:17:49.546 "copy": true, 00:17:49.546 "nvme_iov_md": false 00:17:49.546 }, 00:17:49.546 "memory_domains": [ 00:17:49.546 { 00:17:49.546 "dma_device_id": "system", 00:17:49.546 "dma_device_type": 1 00:17:49.546 }, 00:17:49.546 { 00:17:49.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.546 "dma_device_type": 
2 00:17:49.546 } 00:17:49.546 ], 00:17:49.546 "driver_specific": {} 00:17:49.546 } 00:17:49.546 ] 00:17:49.546 18:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:49.546 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:49.546 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:49.546 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:49.805 [2024-07-12 18:21:33.327341] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:49.805 [2024-07-12 18:21:33.327381] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:49.805 [2024-07-12 18:21:33.327402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:49.805 [2024-07-12 18:21:33.328706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:49.805 [2024-07-12 18:21:33.328748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.805 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:50.064 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.064 "name": "Existed_Raid", 00:17:50.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.064 "strip_size_kb": 64, 00:17:50.064 "state": "configuring", 00:17:50.064 "raid_level": "raid0", 00:17:50.064 "superblock": false, 00:17:50.064 "num_base_bdevs": 4, 00:17:50.064 "num_base_bdevs_discovered": 3, 00:17:50.064 "num_base_bdevs_operational": 4, 00:17:50.064 "base_bdevs_list": [ 00:17:50.064 { 00:17:50.064 "name": "BaseBdev1", 00:17:50.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.064 "is_configured": false, 00:17:50.064 "data_offset": 0, 00:17:50.064 "data_size": 0 00:17:50.064 }, 00:17:50.064 { 00:17:50.064 "name": "BaseBdev2", 00:17:50.064 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:17:50.064 "is_configured": true, 00:17:50.064 "data_offset": 0, 00:17:50.064 "data_size": 65536 00:17:50.064 }, 00:17:50.064 { 00:17:50.064 "name": "BaseBdev3", 00:17:50.064 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:17:50.064 "is_configured": true, 00:17:50.064 "data_offset": 0, 00:17:50.064 "data_size": 65536 00:17:50.064 
}, 00:17:50.064 { 00:17:50.064 "name": "BaseBdev4", 00:17:50.064 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:17:50.064 "is_configured": true, 00:17:50.064 "data_offset": 0, 00:17:50.064 "data_size": 65536 00:17:50.064 } 00:17:50.064 ] 00:17:50.064 }' 00:17:50.064 18:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.064 18:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.631 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:50.890 [2024-07-12 18:21:34.410165] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:50.890 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.189 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.189 "name": "Existed_Raid", 00:17:51.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.189 "strip_size_kb": 64, 00:17:51.189 "state": "configuring", 00:17:51.189 "raid_level": "raid0", 00:17:51.189 "superblock": false, 00:17:51.189 "num_base_bdevs": 4, 00:17:51.189 "num_base_bdevs_discovered": 2, 00:17:51.189 "num_base_bdevs_operational": 4, 00:17:51.189 "base_bdevs_list": [ 00:17:51.189 { 00:17:51.189 "name": "BaseBdev1", 00:17:51.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.189 "is_configured": false, 00:17:51.189 "data_offset": 0, 00:17:51.189 "data_size": 0 00:17:51.189 }, 00:17:51.189 { 00:17:51.189 "name": null, 00:17:51.189 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:17:51.189 "is_configured": false, 00:17:51.189 "data_offset": 0, 00:17:51.189 "data_size": 65536 00:17:51.189 }, 00:17:51.189 { 00:17:51.189 "name": "BaseBdev3", 00:17:51.189 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:17:51.189 "is_configured": true, 00:17:51.189 "data_offset": 0, 00:17:51.189 "data_size": 65536 00:17:51.189 }, 00:17:51.189 { 00:17:51.189 "name": "BaseBdev4", 00:17:51.189 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:17:51.189 "is_configured": true, 00:17:51.189 "data_offset": 0, 00:17:51.189 "data_size": 65536 00:17:51.189 } 00:17:51.189 ] 00:17:51.189 }' 00:17:51.189 18:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.189 18:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.788 18:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.788 18:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:52.046 18:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:52.046 18:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:52.305 [2024-07-12 18:21:35.974832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:52.305 BaseBdev1 00:17:52.305 18:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:52.305 18:21:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:52.305 18:21:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:52.305 18:21:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:52.305 18:21:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:52.305 18:21:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:52.305 18:21:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.563 18:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:52.821 [ 00:17:52.821 { 00:17:52.821 "name": "BaseBdev1", 00:17:52.821 "aliases": [ 00:17:52.821 "0ab05d07-4010-4925-9da9-abca0b269e96" 00:17:52.821 ], 00:17:52.821 
"product_name": "Malloc disk", 00:17:52.821 "block_size": 512, 00:17:52.821 "num_blocks": 65536, 00:17:52.821 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:17:52.821 "assigned_rate_limits": { 00:17:52.821 "rw_ios_per_sec": 0, 00:17:52.821 "rw_mbytes_per_sec": 0, 00:17:52.821 "r_mbytes_per_sec": 0, 00:17:52.821 "w_mbytes_per_sec": 0 00:17:52.821 }, 00:17:52.821 "claimed": true, 00:17:52.821 "claim_type": "exclusive_write", 00:17:52.821 "zoned": false, 00:17:52.821 "supported_io_types": { 00:17:52.821 "read": true, 00:17:52.821 "write": true, 00:17:52.821 "unmap": true, 00:17:52.821 "flush": true, 00:17:52.821 "reset": true, 00:17:52.821 "nvme_admin": false, 00:17:52.821 "nvme_io": false, 00:17:52.821 "nvme_io_md": false, 00:17:52.821 "write_zeroes": true, 00:17:52.821 "zcopy": true, 00:17:52.821 "get_zone_info": false, 00:17:52.821 "zone_management": false, 00:17:52.821 "zone_append": false, 00:17:52.821 "compare": false, 00:17:52.821 "compare_and_write": false, 00:17:52.821 "abort": true, 00:17:52.821 "seek_hole": false, 00:17:52.821 "seek_data": false, 00:17:52.821 "copy": true, 00:17:52.821 "nvme_iov_md": false 00:17:52.821 }, 00:17:52.821 "memory_domains": [ 00:17:52.821 { 00:17:52.821 "dma_device_id": "system", 00:17:52.821 "dma_device_type": 1 00:17:52.821 }, 00:17:52.821 { 00:17:52.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.821 "dma_device_type": 2 00:17:52.821 } 00:17:52.821 ], 00:17:52.821 "driver_specific": {} 00:17:52.821 } 00:17:52.821 ] 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.821 
18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.821 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.080 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.080 "name": "Existed_Raid", 00:17:53.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.080 "strip_size_kb": 64, 00:17:53.080 "state": "configuring", 00:17:53.080 "raid_level": "raid0", 00:17:53.080 "superblock": false, 00:17:53.080 "num_base_bdevs": 4, 00:17:53.080 "num_base_bdevs_discovered": 3, 00:17:53.080 "num_base_bdevs_operational": 4, 00:17:53.080 "base_bdevs_list": [ 00:17:53.080 { 00:17:53.080 "name": "BaseBdev1", 00:17:53.080 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:17:53.080 "is_configured": true, 00:17:53.080 "data_offset": 0, 00:17:53.080 "data_size": 65536 00:17:53.080 }, 00:17:53.080 { 00:17:53.080 "name": null, 00:17:53.080 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:17:53.080 "is_configured": false, 00:17:53.080 "data_offset": 0, 
00:17:53.080 "data_size": 65536 00:17:53.080 }, 00:17:53.080 { 00:17:53.080 "name": "BaseBdev3", 00:17:53.080 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:17:53.080 "is_configured": true, 00:17:53.080 "data_offset": 0, 00:17:53.080 "data_size": 65536 00:17:53.080 }, 00:17:53.080 { 00:17:53.080 "name": "BaseBdev4", 00:17:53.080 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:17:53.080 "is_configured": true, 00:17:53.080 "data_offset": 0, 00:17:53.080 "data_size": 65536 00:17:53.080 } 00:17:53.080 ] 00:17:53.080 }' 00:17:53.080 18:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.080 18:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:53.645 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.645 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:53.903 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:53.903 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:54.162 [2024-07-12 18:21:37.807710] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.162 18:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.420 18:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.420 "name": "Existed_Raid", 00:17:54.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.420 "strip_size_kb": 64, 00:17:54.420 "state": "configuring", 00:17:54.420 "raid_level": "raid0", 00:17:54.420 "superblock": false, 00:17:54.420 "num_base_bdevs": 4, 00:17:54.420 "num_base_bdevs_discovered": 2, 00:17:54.420 "num_base_bdevs_operational": 4, 00:17:54.420 "base_bdevs_list": [ 00:17:54.420 { 00:17:54.420 "name": "BaseBdev1", 00:17:54.420 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:17:54.420 "is_configured": true, 00:17:54.420 "data_offset": 0, 00:17:54.420 "data_size": 65536 00:17:54.420 }, 00:17:54.420 { 00:17:54.420 "name": null, 00:17:54.420 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:17:54.420 "is_configured": false, 00:17:54.420 "data_offset": 0, 00:17:54.420 "data_size": 65536 00:17:54.420 }, 00:17:54.420 { 00:17:54.420 "name": null, 00:17:54.420 
"uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:17:54.420 "is_configured": false, 00:17:54.420 "data_offset": 0, 00:17:54.420 "data_size": 65536 00:17:54.420 }, 00:17:54.420 { 00:17:54.420 "name": "BaseBdev4", 00:17:54.420 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:17:54.420 "is_configured": true, 00:17:54.420 "data_offset": 0, 00:17:54.420 "data_size": 65536 00:17:54.420 } 00:17:54.420 ] 00:17:54.420 }' 00:17:54.420 18:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.420 18:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.987 18:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.987 18:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:55.246 18:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:55.246 18:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:55.504 [2024-07-12 18:21:39.127229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.504 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.762 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.762 "name": "Existed_Raid", 00:17:55.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.763 "strip_size_kb": 64, 00:17:55.763 "state": "configuring", 00:17:55.763 "raid_level": "raid0", 00:17:55.763 "superblock": false, 00:17:55.763 "num_base_bdevs": 4, 00:17:55.763 "num_base_bdevs_discovered": 3, 00:17:55.763 "num_base_bdevs_operational": 4, 00:17:55.763 "base_bdevs_list": [ 00:17:55.763 { 00:17:55.763 "name": "BaseBdev1", 00:17:55.763 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:17:55.763 "is_configured": true, 00:17:55.763 "data_offset": 0, 00:17:55.763 "data_size": 65536 00:17:55.763 }, 00:17:55.763 { 00:17:55.763 "name": null, 00:17:55.763 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:17:55.763 "is_configured": false, 00:17:55.763 "data_offset": 0, 00:17:55.763 "data_size": 65536 00:17:55.763 }, 00:17:55.763 { 00:17:55.763 "name": "BaseBdev3", 00:17:55.763 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:17:55.763 "is_configured": 
true, 00:17:55.763 "data_offset": 0, 00:17:55.763 "data_size": 65536 00:17:55.763 }, 00:17:55.763 { 00:17:55.763 "name": "BaseBdev4", 00:17:55.763 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:17:55.763 "is_configured": true, 00:17:55.763 "data_offset": 0, 00:17:55.763 "data_size": 65536 00:17:55.763 } 00:17:55.763 ] 00:17:55.763 }' 00:17:55.763 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.763 18:21:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.330 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.330 18:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:56.589 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:56.589 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:56.848 [2024-07-12 18:21:40.430809] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.848 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.107 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.107 "name": "Existed_Raid", 00:17:57.107 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.107 "strip_size_kb": 64, 00:17:57.107 "state": "configuring", 00:17:57.107 "raid_level": "raid0", 00:17:57.107 "superblock": false, 00:17:57.107 "num_base_bdevs": 4, 00:17:57.107 "num_base_bdevs_discovered": 2, 00:17:57.107 "num_base_bdevs_operational": 4, 00:17:57.107 "base_bdevs_list": [ 00:17:57.107 { 00:17:57.107 "name": null, 00:17:57.107 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:17:57.107 "is_configured": false, 00:17:57.107 "data_offset": 0, 00:17:57.107 "data_size": 65536 00:17:57.107 }, 00:17:57.107 { 00:17:57.107 "name": null, 00:17:57.107 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:17:57.107 "is_configured": false, 00:17:57.107 "data_offset": 0, 00:17:57.107 "data_size": 65536 00:17:57.107 }, 00:17:57.107 { 00:17:57.107 "name": "BaseBdev3", 00:17:57.107 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:17:57.107 "is_configured": true, 00:17:57.107 "data_offset": 0, 00:17:57.107 "data_size": 65536 00:17:57.107 }, 00:17:57.107 { 00:17:57.107 "name": 
"BaseBdev4", 00:17:57.107 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:17:57.107 "is_configured": true, 00:17:57.107 "data_offset": 0, 00:17:57.107 "data_size": 65536 00:17:57.107 } 00:17:57.107 ] 00:17:57.107 }' 00:17:57.107 18:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.107 18:21:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.675 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.675 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:57.934 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:57.934 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:58.193 [2024-07-12 18:21:41.690572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.193 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.452 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.452 "name": "Existed_Raid", 00:17:58.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.452 "strip_size_kb": 64, 00:17:58.452 "state": "configuring", 00:17:58.452 "raid_level": "raid0", 00:17:58.452 "superblock": false, 00:17:58.452 "num_base_bdevs": 4, 00:17:58.452 "num_base_bdevs_discovered": 3, 00:17:58.452 "num_base_bdevs_operational": 4, 00:17:58.452 "base_bdevs_list": [ 00:17:58.452 { 00:17:58.452 "name": null, 00:17:58.452 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:17:58.452 "is_configured": false, 00:17:58.452 "data_offset": 0, 00:17:58.452 "data_size": 65536 00:17:58.452 }, 00:17:58.452 { 00:17:58.452 "name": "BaseBdev2", 00:17:58.452 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:17:58.452 "is_configured": true, 00:17:58.452 "data_offset": 0, 00:17:58.452 "data_size": 65536 00:17:58.452 }, 00:17:58.452 { 00:17:58.452 "name": "BaseBdev3", 00:17:58.452 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:17:58.452 "is_configured": true, 00:17:58.452 "data_offset": 0, 00:17:58.452 "data_size": 65536 00:17:58.452 }, 00:17:58.452 { 00:17:58.452 "name": "BaseBdev4", 00:17:58.452 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 
00:17:58.452 "is_configured": true, 00:17:58.452 "data_offset": 0, 00:17:58.452 "data_size": 65536 00:17:58.452 } 00:17:58.452 ] 00:17:58.452 }' 00:17:58.452 18:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.452 18:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.019 18:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.019 18:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:59.019 18:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:59.019 18:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.019 18:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:59.597 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 0ab05d07-4010-4925-9da9-abca0b269e96 00:17:59.856 [2024-07-12 18:21:43.403706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:59.856 [2024-07-12 18:21:43.403742] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16f3040 00:17:59.856 [2024-07-12 18:21:43.403751] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:59.856 [2024-07-12 18:21:43.403957] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16eea70 00:17:59.856 [2024-07-12 18:21:43.404101] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16f3040 
00:17:59.856 [2024-07-12 18:21:43.404114] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16f3040 00:17:59.856 [2024-07-12 18:21:43.404277] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:59.856 NewBaseBdev 00:17:59.856 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:59.856 18:21:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:59.856 18:21:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:59.856 18:21:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:59.856 18:21:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:59.856 18:21:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:59.856 18:21:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:00.116 [ 00:18:00.116 { 00:18:00.116 "name": "NewBaseBdev", 00:18:00.116 "aliases": [ 00:18:00.116 "0ab05d07-4010-4925-9da9-abca0b269e96" 00:18:00.116 ], 00:18:00.116 "product_name": "Malloc disk", 00:18:00.116 "block_size": 512, 00:18:00.116 "num_blocks": 65536, 00:18:00.116 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:18:00.116 "assigned_rate_limits": { 00:18:00.116 "rw_ios_per_sec": 0, 00:18:00.116 "rw_mbytes_per_sec": 0, 00:18:00.116 "r_mbytes_per_sec": 0, 00:18:00.116 "w_mbytes_per_sec": 0 00:18:00.116 }, 00:18:00.116 "claimed": true, 00:18:00.116 "claim_type": 
"exclusive_write", 00:18:00.116 "zoned": false, 00:18:00.116 "supported_io_types": { 00:18:00.116 "read": true, 00:18:00.116 "write": true, 00:18:00.116 "unmap": true, 00:18:00.116 "flush": true, 00:18:00.116 "reset": true, 00:18:00.116 "nvme_admin": false, 00:18:00.116 "nvme_io": false, 00:18:00.116 "nvme_io_md": false, 00:18:00.116 "write_zeroes": true, 00:18:00.116 "zcopy": true, 00:18:00.116 "get_zone_info": false, 00:18:00.116 "zone_management": false, 00:18:00.116 "zone_append": false, 00:18:00.116 "compare": false, 00:18:00.116 "compare_and_write": false, 00:18:00.116 "abort": true, 00:18:00.116 "seek_hole": false, 00:18:00.116 "seek_data": false, 00:18:00.116 "copy": true, 00:18:00.116 "nvme_iov_md": false 00:18:00.116 }, 00:18:00.116 "memory_domains": [ 00:18:00.116 { 00:18:00.116 "dma_device_id": "system", 00:18:00.116 "dma_device_type": 1 00:18:00.116 }, 00:18:00.116 { 00:18:00.116 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.116 "dma_device_type": 2 00:18:00.116 } 00:18:00.116 ], 00:18:00.116 "driver_specific": {} 00:18:00.116 } 00:18:00.116 ] 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.116 18:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.375 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.375 "name": "Existed_Raid", 00:18:00.375 "uuid": "3b14cbf1-8319-4556-a01b-64aeb4fcc8e5", 00:18:00.375 "strip_size_kb": 64, 00:18:00.375 "state": "online", 00:18:00.375 "raid_level": "raid0", 00:18:00.375 "superblock": false, 00:18:00.375 "num_base_bdevs": 4, 00:18:00.375 "num_base_bdevs_discovered": 4, 00:18:00.375 "num_base_bdevs_operational": 4, 00:18:00.375 "base_bdevs_list": [ 00:18:00.375 { 00:18:00.375 "name": "NewBaseBdev", 00:18:00.375 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:18:00.375 "is_configured": true, 00:18:00.375 "data_offset": 0, 00:18:00.375 "data_size": 65536 00:18:00.375 }, 00:18:00.375 { 00:18:00.375 "name": "BaseBdev2", 00:18:00.375 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:18:00.375 "is_configured": true, 00:18:00.375 "data_offset": 0, 00:18:00.375 "data_size": 65536 00:18:00.375 }, 00:18:00.375 { 00:18:00.375 "name": "BaseBdev3", 00:18:00.375 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:18:00.375 "is_configured": true, 00:18:00.375 "data_offset": 0, 00:18:00.375 "data_size": 65536 00:18:00.375 }, 00:18:00.375 { 00:18:00.375 "name": "BaseBdev4", 00:18:00.375 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:18:00.375 "is_configured": true, 
00:18:00.375 "data_offset": 0, 00:18:00.375 "data_size": 65536 00:18:00.375 } 00:18:00.375 ] 00:18:00.375 }' 00:18:00.375 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.375 18:21:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.943 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:00.943 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:00.943 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:00.943 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:00.943 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:00.943 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:00.943 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:00.943 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:01.202 [2024-07-12 18:21:44.811772] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:01.202 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:01.202 "name": "Existed_Raid", 00:18:01.202 "aliases": [ 00:18:01.202 "3b14cbf1-8319-4556-a01b-64aeb4fcc8e5" 00:18:01.202 ], 00:18:01.202 "product_name": "Raid Volume", 00:18:01.202 "block_size": 512, 00:18:01.202 "num_blocks": 262144, 00:18:01.202 "uuid": "3b14cbf1-8319-4556-a01b-64aeb4fcc8e5", 00:18:01.202 "assigned_rate_limits": { 00:18:01.202 "rw_ios_per_sec": 0, 00:18:01.202 "rw_mbytes_per_sec": 0, 00:18:01.202 "r_mbytes_per_sec": 0, 00:18:01.202 "w_mbytes_per_sec": 0 
00:18:01.202 }, 00:18:01.202 "claimed": false, 00:18:01.202 "zoned": false, 00:18:01.202 "supported_io_types": { 00:18:01.202 "read": true, 00:18:01.202 "write": true, 00:18:01.202 "unmap": true, 00:18:01.202 "flush": true, 00:18:01.202 "reset": true, 00:18:01.202 "nvme_admin": false, 00:18:01.202 "nvme_io": false, 00:18:01.202 "nvme_io_md": false, 00:18:01.202 "write_zeroes": true, 00:18:01.202 "zcopy": false, 00:18:01.202 "get_zone_info": false, 00:18:01.202 "zone_management": false, 00:18:01.202 "zone_append": false, 00:18:01.202 "compare": false, 00:18:01.202 "compare_and_write": false, 00:18:01.202 "abort": false, 00:18:01.202 "seek_hole": false, 00:18:01.202 "seek_data": false, 00:18:01.202 "copy": false, 00:18:01.202 "nvme_iov_md": false 00:18:01.202 }, 00:18:01.202 "memory_domains": [ 00:18:01.202 { 00:18:01.202 "dma_device_id": "system", 00:18:01.202 "dma_device_type": 1 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.202 "dma_device_type": 2 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "dma_device_id": "system", 00:18:01.202 "dma_device_type": 1 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.202 "dma_device_type": 2 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "dma_device_id": "system", 00:18:01.202 "dma_device_type": 1 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.202 "dma_device_type": 2 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "dma_device_id": "system", 00:18:01.202 "dma_device_type": 1 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.202 "dma_device_type": 2 00:18:01.202 } 00:18:01.202 ], 00:18:01.202 "driver_specific": { 00:18:01.202 "raid": { 00:18:01.202 "uuid": "3b14cbf1-8319-4556-a01b-64aeb4fcc8e5", 00:18:01.202 "strip_size_kb": 64, 00:18:01.202 "state": "online", 00:18:01.202 "raid_level": "raid0", 00:18:01.202 "superblock": false, 00:18:01.202 
"num_base_bdevs": 4, 00:18:01.202 "num_base_bdevs_discovered": 4, 00:18:01.202 "num_base_bdevs_operational": 4, 00:18:01.202 "base_bdevs_list": [ 00:18:01.202 { 00:18:01.202 "name": "NewBaseBdev", 00:18:01.202 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:18:01.202 "is_configured": true, 00:18:01.202 "data_offset": 0, 00:18:01.202 "data_size": 65536 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "name": "BaseBdev2", 00:18:01.202 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:18:01.202 "is_configured": true, 00:18:01.202 "data_offset": 0, 00:18:01.202 "data_size": 65536 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "name": "BaseBdev3", 00:18:01.202 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:18:01.202 "is_configured": true, 00:18:01.202 "data_offset": 0, 00:18:01.202 "data_size": 65536 00:18:01.202 }, 00:18:01.202 { 00:18:01.202 "name": "BaseBdev4", 00:18:01.202 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:18:01.202 "is_configured": true, 00:18:01.202 "data_offset": 0, 00:18:01.202 "data_size": 65536 00:18:01.202 } 00:18:01.202 ] 00:18:01.202 } 00:18:01.202 } 00:18:01.202 }' 00:18:01.202 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:01.202 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:01.202 BaseBdev2 00:18:01.202 BaseBdev3 00:18:01.202 BaseBdev4' 00:18:01.202 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:01.202 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:01.202 18:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:01.461 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:18:01.461 "name": "NewBaseBdev", 00:18:01.461 "aliases": [ 00:18:01.461 "0ab05d07-4010-4925-9da9-abca0b269e96" 00:18:01.461 ], 00:18:01.461 "product_name": "Malloc disk", 00:18:01.461 "block_size": 512, 00:18:01.461 "num_blocks": 65536, 00:18:01.461 "uuid": "0ab05d07-4010-4925-9da9-abca0b269e96", 00:18:01.461 "assigned_rate_limits": { 00:18:01.461 "rw_ios_per_sec": 0, 00:18:01.461 "rw_mbytes_per_sec": 0, 00:18:01.461 "r_mbytes_per_sec": 0, 00:18:01.461 "w_mbytes_per_sec": 0 00:18:01.461 }, 00:18:01.461 "claimed": true, 00:18:01.461 "claim_type": "exclusive_write", 00:18:01.461 "zoned": false, 00:18:01.461 "supported_io_types": { 00:18:01.461 "read": true, 00:18:01.461 "write": true, 00:18:01.461 "unmap": true, 00:18:01.461 "flush": true, 00:18:01.461 "reset": true, 00:18:01.461 "nvme_admin": false, 00:18:01.461 "nvme_io": false, 00:18:01.461 "nvme_io_md": false, 00:18:01.461 "write_zeroes": true, 00:18:01.461 "zcopy": true, 00:18:01.461 "get_zone_info": false, 00:18:01.461 "zone_management": false, 00:18:01.461 "zone_append": false, 00:18:01.461 "compare": false, 00:18:01.461 "compare_and_write": false, 00:18:01.461 "abort": true, 00:18:01.461 "seek_hole": false, 00:18:01.461 "seek_data": false, 00:18:01.461 "copy": true, 00:18:01.461 "nvme_iov_md": false 00:18:01.461 }, 00:18:01.461 "memory_domains": [ 00:18:01.461 { 00:18:01.461 "dma_device_id": "system", 00:18:01.461 "dma_device_type": 1 00:18:01.461 }, 00:18:01.461 { 00:18:01.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.461 "dma_device_type": 2 00:18:01.461 } 00:18:01.461 ], 00:18:01.461 "driver_specific": {} 00:18:01.461 }' 00:18:01.461 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.720 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.720 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:01.720 18:21:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.720 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.720 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:01.720 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.720 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.979 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:01.979 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.979 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.979 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:01.979 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:01.979 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:01.979 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:02.238 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.238 "name": "BaseBdev2", 00:18:02.238 "aliases": [ 00:18:02.238 "93d83aca-de01-4230-a859-c3f6452cb770" 00:18:02.238 ], 00:18:02.238 "product_name": "Malloc disk", 00:18:02.238 "block_size": 512, 00:18:02.238 "num_blocks": 65536, 00:18:02.238 "uuid": "93d83aca-de01-4230-a859-c3f6452cb770", 00:18:02.238 "assigned_rate_limits": { 00:18:02.238 "rw_ios_per_sec": 0, 00:18:02.238 "rw_mbytes_per_sec": 0, 00:18:02.238 "r_mbytes_per_sec": 0, 00:18:02.238 "w_mbytes_per_sec": 0 00:18:02.238 }, 00:18:02.238 "claimed": true, 00:18:02.238 "claim_type": "exclusive_write", 00:18:02.238 "zoned": false, 
00:18:02.238 "supported_io_types": { 00:18:02.238 "read": true, 00:18:02.238 "write": true, 00:18:02.238 "unmap": true, 00:18:02.238 "flush": true, 00:18:02.238 "reset": true, 00:18:02.238 "nvme_admin": false, 00:18:02.238 "nvme_io": false, 00:18:02.238 "nvme_io_md": false, 00:18:02.238 "write_zeroes": true, 00:18:02.238 "zcopy": true, 00:18:02.238 "get_zone_info": false, 00:18:02.238 "zone_management": false, 00:18:02.238 "zone_append": false, 00:18:02.238 "compare": false, 00:18:02.238 "compare_and_write": false, 00:18:02.238 "abort": true, 00:18:02.238 "seek_hole": false, 00:18:02.238 "seek_data": false, 00:18:02.238 "copy": true, 00:18:02.238 "nvme_iov_md": false 00:18:02.238 }, 00:18:02.238 "memory_domains": [ 00:18:02.238 { 00:18:02.238 "dma_device_id": "system", 00:18:02.238 "dma_device_type": 1 00:18:02.238 }, 00:18:02.238 { 00:18:02.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.238 "dma_device_type": 2 00:18:02.238 } 00:18:02.238 ], 00:18:02.238 "driver_specific": {} 00:18:02.238 }' 00:18:02.238 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.238 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.238 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.238 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.238 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.238 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.238 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.497 18:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.497 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.497 18:21:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.497 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.497 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.497 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.498 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:02.498 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:02.756 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.756 "name": "BaseBdev3", 00:18:02.756 "aliases": [ 00:18:02.756 "33dcb066-8bcb-4e6e-8a76-8c248e772d63" 00:18:02.756 ], 00:18:02.756 "product_name": "Malloc disk", 00:18:02.756 "block_size": 512, 00:18:02.756 "num_blocks": 65536, 00:18:02.756 "uuid": "33dcb066-8bcb-4e6e-8a76-8c248e772d63", 00:18:02.756 "assigned_rate_limits": { 00:18:02.756 "rw_ios_per_sec": 0, 00:18:02.756 "rw_mbytes_per_sec": 0, 00:18:02.756 "r_mbytes_per_sec": 0, 00:18:02.756 "w_mbytes_per_sec": 0 00:18:02.756 }, 00:18:02.756 "claimed": true, 00:18:02.756 "claim_type": "exclusive_write", 00:18:02.756 "zoned": false, 00:18:02.756 "supported_io_types": { 00:18:02.756 "read": true, 00:18:02.756 "write": true, 00:18:02.756 "unmap": true, 00:18:02.756 "flush": true, 00:18:02.756 "reset": true, 00:18:02.756 "nvme_admin": false, 00:18:02.756 "nvme_io": false, 00:18:02.756 "nvme_io_md": false, 00:18:02.756 "write_zeroes": true, 00:18:02.756 "zcopy": true, 00:18:02.756 "get_zone_info": false, 00:18:02.756 "zone_management": false, 00:18:02.756 "zone_append": false, 00:18:02.756 "compare": false, 00:18:02.756 "compare_and_write": false, 00:18:02.756 "abort": true, 00:18:02.756 "seek_hole": false, 
00:18:02.756 "seek_data": false, 00:18:02.756 "copy": true, 00:18:02.756 "nvme_iov_md": false 00:18:02.756 }, 00:18:02.756 "memory_domains": [ 00:18:02.756 { 00:18:02.756 "dma_device_id": "system", 00:18:02.756 "dma_device_type": 1 00:18:02.756 }, 00:18:02.756 { 00:18:02.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.756 "dma_device_type": 2 00:18:02.756 } 00:18:02.756 ], 00:18:02.756 "driver_specific": {} 00:18:02.756 }' 00:18:02.756 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.756 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.756 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.756 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 
00:18:03.015 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:03.273 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:03.273 "name": "BaseBdev4", 00:18:03.273 "aliases": [ 00:18:03.273 "32a95680-7a5f-433f-a7d1-3c15ac2c87f8" 00:18:03.273 ], 00:18:03.273 "product_name": "Malloc disk", 00:18:03.273 "block_size": 512, 00:18:03.273 "num_blocks": 65536, 00:18:03.273 "uuid": "32a95680-7a5f-433f-a7d1-3c15ac2c87f8", 00:18:03.273 "assigned_rate_limits": { 00:18:03.273 "rw_ios_per_sec": 0, 00:18:03.273 "rw_mbytes_per_sec": 0, 00:18:03.273 "r_mbytes_per_sec": 0, 00:18:03.273 "w_mbytes_per_sec": 0 00:18:03.273 }, 00:18:03.273 "claimed": true, 00:18:03.273 "claim_type": "exclusive_write", 00:18:03.273 "zoned": false, 00:18:03.273 "supported_io_types": { 00:18:03.273 "read": true, 00:18:03.273 "write": true, 00:18:03.273 "unmap": true, 00:18:03.273 "flush": true, 00:18:03.273 "reset": true, 00:18:03.273 "nvme_admin": false, 00:18:03.273 "nvme_io": false, 00:18:03.273 "nvme_io_md": false, 00:18:03.273 "write_zeroes": true, 00:18:03.273 "zcopy": true, 00:18:03.273 "get_zone_info": false, 00:18:03.273 "zone_management": false, 00:18:03.273 "zone_append": false, 00:18:03.273 "compare": false, 00:18:03.273 "compare_and_write": false, 00:18:03.273 "abort": true, 00:18:03.273 "seek_hole": false, 00:18:03.273 "seek_data": false, 00:18:03.273 "copy": true, 00:18:03.273 "nvme_iov_md": false 00:18:03.273 }, 00:18:03.273 "memory_domains": [ 00:18:03.273 { 00:18:03.273 "dma_device_id": "system", 00:18:03.273 "dma_device_type": 1 00:18:03.273 }, 00:18:03.273 { 00:18:03.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.274 "dma_device_type": 2 00:18:03.274 } 00:18:03.274 ], 00:18:03.274 "driver_specific": {} 00:18:03.274 }' 00:18:03.274 18:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.274 18:21:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:03.531 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:03.790 [2024-07-12 18:21:47.458492] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:03.790 [2024-07-12 18:21:47.458523] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:03.790 [2024-07-12 18:21:47.458573] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:03.790 [2024-07-12 18:21:47.458630] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:03.790 [2024-07-12 18:21:47.458646] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16f3040 name Existed_Raid, state offline 00:18:03.790 18:21:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2518775 00:18:03.790 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2518775 ']' 00:18:03.790 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2518775 00:18:03.790 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:03.790 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:03.790 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2518775 00:18:04.049 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:04.049 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:04.049 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2518775' 00:18:04.049 killing process with pid 2518775 00:18:04.049 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2518775 00:18:04.049 [2024-07-12 18:21:47.537212] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:04.049 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2518775 00:18:04.049 [2024-07-12 18:21:47.578847] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:04.308 00:18:04.308 real 0m32.795s 00:18:04.308 user 1m0.133s 00:18:04.308 sys 0m5.835s 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.308 ************************************ 00:18:04.308 END TEST raid_state_function_test 
00:18:04.308 ************************************ 00:18:04.308 18:21:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:04.308 18:21:47 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:04.308 18:21:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:04.308 18:21:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:04.308 18:21:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:04.308 ************************************ 00:18:04.308 START TEST raid_state_function_test_sb 00:18:04.308 ************************************ 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2523675 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2523675' 00:18:04.308 Process raid pid: 2523675 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2523675 /var/tmp/spdk-raid.sock 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2523675 ']' 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:04.308 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:04.308 18:21:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:04.308 [2024-07-12 18:21:47.938035] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:18:04.308 [2024-07-12 18:21:47.938100] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:04.567 [2024-07-12 18:21:48.066257] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:04.567 [2024-07-12 18:21:48.171859] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:04.567 [2024-07-12 18:21:48.232426] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:04.567 [2024-07-12 18:21:48.232455] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:05.135 18:21:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:05.135 18:21:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:05.135 18:21:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:05.704 [2024-07-12 18:21:49.327408] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:05.704 [2024-07-12 18:21:49.327450] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:05.704 [2024-07-12 18:21:49.327461] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:05.704 [2024-07-12 18:21:49.327473] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:05.704 [2024-07-12 18:21:49.327481] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:05.704 [2024-07-12 18:21:49.327492] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:18:05.704 [2024-07-12 18:21:49.327501] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:05.704 [2024-07-12 18:21:49.327512] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.704 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:05.964 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.964 "name": "Existed_Raid", 00:18:05.964 "uuid": "9d502627-4b9d-4262-b587-63805dde93b9", 
00:18:05.964 "strip_size_kb": 64, 00:18:05.964 "state": "configuring", 00:18:05.964 "raid_level": "raid0", 00:18:05.964 "superblock": true, 00:18:05.964 "num_base_bdevs": 4, 00:18:05.964 "num_base_bdevs_discovered": 0, 00:18:05.964 "num_base_bdevs_operational": 4, 00:18:05.964 "base_bdevs_list": [ 00:18:05.964 { 00:18:05.964 "name": "BaseBdev1", 00:18:05.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.964 "is_configured": false, 00:18:05.964 "data_offset": 0, 00:18:05.964 "data_size": 0 00:18:05.964 }, 00:18:05.964 { 00:18:05.964 "name": "BaseBdev2", 00:18:05.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.964 "is_configured": false, 00:18:05.964 "data_offset": 0, 00:18:05.964 "data_size": 0 00:18:05.964 }, 00:18:05.964 { 00:18:05.964 "name": "BaseBdev3", 00:18:05.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.964 "is_configured": false, 00:18:05.964 "data_offset": 0, 00:18:05.964 "data_size": 0 00:18:05.964 }, 00:18:05.964 { 00:18:05.964 "name": "BaseBdev4", 00:18:05.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.964 "is_configured": false, 00:18:05.964 "data_offset": 0, 00:18:05.964 "data_size": 0 00:18:05.964 } 00:18:05.964 ] 00:18:05.964 }' 00:18:05.964 18:21:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.964 18:21:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:06.530 18:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:07.096 [2024-07-12 18:21:50.682802] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:07.096 [2024-07-12 18:21:50.682829] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1803aa0 name Existed_Raid, state configuring 00:18:07.096 18:21:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:07.664 [2024-07-12 18:21:51.188102] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:07.664 [2024-07-12 18:21:51.188132] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:07.664 [2024-07-12 18:21:51.188139] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:07.664 [2024-07-12 18:21:51.188147] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:07.664 [2024-07-12 18:21:51.188153] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:07.664 [2024-07-12 18:21:51.188160] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:07.664 [2024-07-12 18:21:51.188166] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:07.664 [2024-07-12 18:21:51.188174] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:07.664 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:07.990 [2024-07-12 18:21:51.458896] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:07.990 BaseBdev1 00:18:07.990 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:07.990 18:21:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:07.990 18:21:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:07.990 18:21:51 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:07.990 18:21:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:07.990 18:21:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:07.990 18:21:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:08.250 [ 00:18:08.250 { 00:18:08.250 "name": "BaseBdev1", 00:18:08.250 "aliases": [ 00:18:08.250 "da6da937-5530-440d-b547-65da36001b6f" 00:18:08.250 ], 00:18:08.250 "product_name": "Malloc disk", 00:18:08.250 "block_size": 512, 00:18:08.250 "num_blocks": 65536, 00:18:08.250 "uuid": "da6da937-5530-440d-b547-65da36001b6f", 00:18:08.250 "assigned_rate_limits": { 00:18:08.250 "rw_ios_per_sec": 0, 00:18:08.250 "rw_mbytes_per_sec": 0, 00:18:08.250 "r_mbytes_per_sec": 0, 00:18:08.250 "w_mbytes_per_sec": 0 00:18:08.250 }, 00:18:08.250 "claimed": true, 00:18:08.250 "claim_type": "exclusive_write", 00:18:08.250 "zoned": false, 00:18:08.250 "supported_io_types": { 00:18:08.250 "read": true, 00:18:08.250 "write": true, 00:18:08.250 "unmap": true, 00:18:08.250 "flush": true, 00:18:08.250 "reset": true, 00:18:08.250 "nvme_admin": false, 00:18:08.250 "nvme_io": false, 00:18:08.250 "nvme_io_md": false, 00:18:08.250 "write_zeroes": true, 00:18:08.250 "zcopy": true, 00:18:08.250 "get_zone_info": false, 00:18:08.250 "zone_management": false, 00:18:08.250 "zone_append": false, 00:18:08.250 "compare": false, 00:18:08.250 "compare_and_write": false, 00:18:08.250 "abort": true, 00:18:08.250 "seek_hole": false, 00:18:08.250 "seek_data": false, 
00:18:08.250 "copy": true,
00:18:08.250 "nvme_iov_md": false
00:18:08.250 },
00:18:08.250 "memory_domains": [
00:18:08.250 {
00:18:08.250 "dma_device_id": "system",
00:18:08.250 "dma_device_type": 1
00:18:08.250 },
00:18:08.250 {
00:18:08.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:08.250 "dma_device_type": 2
00:18:08.250 }
00:18:08.250 ],
00:18:08.250 "driver_specific": {}
00:18:08.250 }
00:18:08.250 ]
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:08.250 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:08.511 18:21:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:08.770 18:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:08.770 "name": "Existed_Raid",
00:18:08.770 "uuid": "a12915a6-9a41-4b1a-b439-2e3016529918",
00:18:08.770 "strip_size_kb": 64,
00:18:08.770 "state": "configuring",
00:18:08.770 "raid_level": "raid0",
00:18:08.770 "superblock": true,
00:18:08.770 "num_base_bdevs": 4,
00:18:08.770 "num_base_bdevs_discovered": 1,
00:18:08.770 "num_base_bdevs_operational": 4,
00:18:08.770 "base_bdevs_list": [
00:18:08.770 {
00:18:08.770 "name": "BaseBdev1",
00:18:08.770 "uuid": "da6da937-5530-440d-b547-65da36001b6f",
00:18:08.770 "is_configured": true,
00:18:08.770 "data_offset": 2048,
00:18:08.770 "data_size": 63488
00:18:08.770 },
00:18:08.770 {
00:18:08.770 "name": "BaseBdev2",
00:18:08.770 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:08.770 "is_configured": false,
00:18:08.770 "data_offset": 0,
00:18:08.770 "data_size": 0
00:18:08.770 },
00:18:08.770 {
00:18:08.770 "name": "BaseBdev3",
00:18:08.770 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:08.770 "is_configured": false,
00:18:08.770 "data_offset": 0,
00:18:08.770 "data_size": 0
00:18:08.770 },
00:18:08.770 {
00:18:08.770 "name": "BaseBdev4",
00:18:08.770 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:08.770 "is_configured": false,
00:18:08.770 "data_offset": 0,
00:18:08.770 "data_size": 0
00:18:08.770 }
00:18:08.770 ]
00:18:08.770 }'
00:18:08.770 18:21:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:08.770 18:21:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:18:09.702 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:18:09.702 [2024-07-12 18:21:53.307645] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:18:09.702 [2024-07-12 18:21:53.307679] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1803310 name Existed_Raid, state configuring
00:18:09.702 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:18:09.960 [2024-07-12 18:21:53.548295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:18:09.961 [2024-07-12 18:21:53.549348] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:18:09.961 [2024-07-12 18:21:53.549375] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:18:09.961 [2024-07-12 18:21:53.549382] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:18:09.961 [2024-07-12 18:21:53.549390] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:18:09.961 [2024-07-12 18:21:53.549395] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:18:09.961 [2024-07-12 18:21:53.549403] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:09.961 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:10.219 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:10.219 "name": "Existed_Raid",
00:18:10.219 "uuid": "ea2c2de6-2f00-43fc-aabf-8026ff1f7132",
00:18:10.219 "strip_size_kb": 64,
00:18:10.219 "state": "configuring",
00:18:10.219 "raid_level": "raid0",
00:18:10.219 "superblock": true,
00:18:10.219 "num_base_bdevs": 4,
00:18:10.219 "num_base_bdevs_discovered": 1,
00:18:10.219 "num_base_bdevs_operational": 4,
00:18:10.219 "base_bdevs_list": [
00:18:10.219 {
00:18:10.219 "name": "BaseBdev1",
00:18:10.219 "uuid": "da6da937-5530-440d-b547-65da36001b6f",
00:18:10.219 "is_configured": true,
00:18:10.219 "data_offset": 2048,
00:18:10.219 "data_size": 63488
00:18:10.219 },
00:18:10.219 {
00:18:10.219 "name": "BaseBdev2",
00:18:10.219 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:10.219 "is_configured": false,
00:18:10.219 "data_offset": 0,
00:18:10.219 "data_size": 0
00:18:10.219 },
00:18:10.219 {
00:18:10.219 "name": "BaseBdev3",
00:18:10.219 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:10.219 "is_configured": false,
00:18:10.219 "data_offset": 0,
00:18:10.219 "data_size": 0
00:18:10.219 },
00:18:10.219 {
00:18:10.219 "name": "BaseBdev4",
00:18:10.219 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:10.219 "is_configured": false,
00:18:10.219 "data_offset": 0,
00:18:10.219 "data_size": 0
00:18:10.219 }
00:18:10.219 ]
00:18:10.219 }'
00:18:10.219 18:21:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:10.219 18:21:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:18:10.785 18:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:18:11.043 [2024-07-12 18:21:54.579271] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:18:11.043 BaseBdev2
00:18:11.043 18:21:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:18:11.043 18:21:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:18:11.043 18:21:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:18:11.043 18:21:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:18:11.043 18:21:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:18:11.043 18:21:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:18:11.043 18:21:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:18:11.610 18:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:18:11.888 [
00:18:11.888 {
00:18:11.888 "name": "BaseBdev2",
00:18:11.888 "aliases": [
00:18:11.888 "da9809c6-2535-4b4e-89fc-4f2d6d784040"
00:18:11.888 ],
00:18:11.888 "product_name": "Malloc disk",
00:18:11.888 "block_size": 512,
00:18:11.888 "num_blocks": 65536,
00:18:11.888 "uuid": "da9809c6-2535-4b4e-89fc-4f2d6d784040",
00:18:11.888 "assigned_rate_limits": {
00:18:11.888 "rw_ios_per_sec": 0,
00:18:11.888 "rw_mbytes_per_sec": 0,
00:18:11.888 "r_mbytes_per_sec": 0,
00:18:11.888 "w_mbytes_per_sec": 0
00:18:11.888 },
00:18:11.888 "claimed": true,
00:18:11.888 "claim_type": "exclusive_write",
00:18:11.888 "zoned": false,
00:18:11.888 "supported_io_types": {
00:18:11.888 "read": true,
00:18:11.888 "write": true,
00:18:11.888 "unmap": true,
00:18:11.888 "flush": true,
00:18:11.888 "reset": true,
00:18:11.888 "nvme_admin": false,
00:18:11.888 "nvme_io": false,
00:18:11.888 "nvme_io_md": false,
00:18:11.888 "write_zeroes": true,
00:18:11.888 "zcopy": true,
00:18:11.888 "get_zone_info": false,
00:18:11.888 "zone_management": false,
00:18:11.888 "zone_append": false,
00:18:11.888 "compare": false,
00:18:11.888 "compare_and_write": false,
00:18:11.888 "abort": true,
00:18:11.888 "seek_hole": false,
00:18:11.888 "seek_data": false,
00:18:11.888 "copy": true,
00:18:11.888 "nvme_iov_md": false
00:18:11.888 },
00:18:11.888 "memory_domains": [
00:18:11.888 {
00:18:11.888 "dma_device_id": "system",
00:18:11.888 "dma_device_type": 1
00:18:11.888 },
00:18:11.888 {
00:18:11.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:11.888 "dma_device_type": 2
00:18:11.888 }
00:18:11.888 ],
00:18:11.888 "driver_specific": {}
00:18:11.888 }
00:18:11.888 ]
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:11.888 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:12.453 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:12.453 "name": "Existed_Raid",
00:18:12.453 "uuid": "ea2c2de6-2f00-43fc-aabf-8026ff1f7132",
00:18:12.453 "strip_size_kb": 64,
00:18:12.453 "state": "configuring",
00:18:12.453 "raid_level": "raid0",
00:18:12.453 "superblock": true,
00:18:12.453 "num_base_bdevs": 4,
00:18:12.453 "num_base_bdevs_discovered": 2,
00:18:12.453 "num_base_bdevs_operational": 4,
00:18:12.453 "base_bdevs_list": [
00:18:12.453 {
00:18:12.453 "name": "BaseBdev1",
00:18:12.453 "uuid": "da6da937-5530-440d-b547-65da36001b6f",
00:18:12.453 "is_configured": true,
00:18:12.453 "data_offset": 2048,
00:18:12.453 "data_size": 63488
00:18:12.453 },
00:18:12.453 {
00:18:12.453 "name": "BaseBdev2",
00:18:12.453 "uuid": "da9809c6-2535-4b4e-89fc-4f2d6d784040",
00:18:12.453 "is_configured": true,
00:18:12.453 "data_offset": 2048,
00:18:12.453 "data_size": 63488
00:18:12.453 },
00:18:12.453 {
00:18:12.453 "name": "BaseBdev3",
00:18:12.453 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:12.453 "is_configured": false,
00:18:12.453 "data_offset": 0,
00:18:12.453 "data_size": 0
00:18:12.453 },
00:18:12.453 {
00:18:12.453 "name": "BaseBdev4",
00:18:12.453 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:12.453 "is_configured": false,
00:18:12.453 "data_offset": 0,
00:18:12.453 "data_size": 0
00:18:12.453 }
00:18:12.453 ]
00:18:12.453 }'
00:18:12.453 18:21:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:12.453 18:21:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:18:13.017 18:21:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:18:13.597 [2024-07-12 18:21:57.242343] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:18:13.597 BaseBdev3
00:18:13.597 18:21:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3
00:18:13.597 18:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:18:13.597 18:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:18:13.597 18:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:18:13.597 18:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:18:13.597 18:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:18:13.597 18:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:18:14.163 18:21:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:18:14.419 [
00:18:14.419 {
00:18:14.419 "name": "BaseBdev3",
00:18:14.419 "aliases": [
00:18:14.419 "9eac696e-d167-4f76-ad9f-67e56b587cd3"
00:18:14.419 ],
00:18:14.419 "product_name": "Malloc disk",
00:18:14.419 "block_size": 512,
00:18:14.419 "num_blocks": 65536,
00:18:14.419 "uuid": "9eac696e-d167-4f76-ad9f-67e56b587cd3",
00:18:14.419 "assigned_rate_limits": {
00:18:14.419 "rw_ios_per_sec": 0,
00:18:14.419 "rw_mbytes_per_sec": 0,
00:18:14.419 "r_mbytes_per_sec": 0,
00:18:14.419 "w_mbytes_per_sec": 0
00:18:14.419 },
00:18:14.419 "claimed": true,
00:18:14.419 "claim_type": "exclusive_write",
00:18:14.419 "zoned": false,
00:18:14.419 "supported_io_types": {
00:18:14.419 "read": true,
00:18:14.419 "write": true,
00:18:14.419 "unmap": true,
00:18:14.419 "flush": true,
00:18:14.419 "reset": true,
00:18:14.419 "nvme_admin": false,
00:18:14.419 "nvme_io": false,
00:18:14.419 "nvme_io_md": false,
00:18:14.419 "write_zeroes": true,
00:18:14.419 "zcopy": true,
00:18:14.419 "get_zone_info": false,
00:18:14.419 "zone_management": false,
00:18:14.419 "zone_append": false,
00:18:14.419 "compare": false,
00:18:14.419 "compare_and_write": false,
00:18:14.419 "abort": true,
00:18:14.419 "seek_hole": false,
00:18:14.419 "seek_data": false,
00:18:14.419 "copy": true,
00:18:14.419 "nvme_iov_md": false
00:18:14.419 },
00:18:14.419 "memory_domains": [
00:18:14.419 {
00:18:14.419 "dma_device_id": "system",
00:18:14.419 "dma_device_type": 1
00:18:14.419 },
00:18:14.419 {
00:18:14.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:14.419 "dma_device_type": 2
00:18:14.419 }
00:18:14.419 ],
00:18:14.419 "driver_specific": {}
00:18:14.419 }
00:18:14.419 ]
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:14.419 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:14.677 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:14.677 "name": "Existed_Raid",
00:18:14.677 "uuid": "ea2c2de6-2f00-43fc-aabf-8026ff1f7132",
00:18:14.677 "strip_size_kb": 64,
00:18:14.677 "state": "configuring",
00:18:14.677 "raid_level": "raid0",
00:18:14.677 "superblock": true,
00:18:14.677 "num_base_bdevs": 4,
00:18:14.677 "num_base_bdevs_discovered": 3,
00:18:14.677 "num_base_bdevs_operational": 4,
00:18:14.677 "base_bdevs_list": [
00:18:14.677 {
00:18:14.677 "name": "BaseBdev1",
00:18:14.677 "uuid": "da6da937-5530-440d-b547-65da36001b6f",
00:18:14.677 "is_configured": true,
00:18:14.677 "data_offset": 2048,
00:18:14.677 "data_size": 63488
00:18:14.677 },
00:18:14.677 {
00:18:14.677 "name": "BaseBdev2",
00:18:14.677 "uuid": "da9809c6-2535-4b4e-89fc-4f2d6d784040",
00:18:14.677 "is_configured": true,
00:18:14.677 "data_offset": 2048,
00:18:14.677 "data_size": 63488
00:18:14.677 },
00:18:14.677 {
00:18:14.677 "name": "BaseBdev3",
00:18:14.677 "uuid": "9eac696e-d167-4f76-ad9f-67e56b587cd3",
00:18:14.677 "is_configured": true,
00:18:14.677 "data_offset": 2048,
00:18:14.677 "data_size": 63488
00:18:14.677 },
00:18:14.677 {
00:18:14.677 "name": "BaseBdev4",
00:18:14.677 "uuid": "00000000-0000-0000-0000-000000000000",
00:18:14.677 "is_configured": false,
00:18:14.677 "data_offset": 0,
00:18:14.677 "data_size": 0
00:18:14.677 }
00:18:14.677 ]
00:18:14.677 }'
00:18:14.677 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:14.677 18:21:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:18:15.240 18:21:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
00:18:15.807 [2024-07-12 18:21:59.368214] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:18:15.807 [2024-07-12 18:21:59.368377] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1804350
00:18:15.807 [2024-07-12 18:21:59.368389] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512
00:18:15.807 [2024-07-12 18:21:59.368527] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1804020
00:18:15.807 [2024-07-12 18:21:59.368623] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1804350
00:18:15.807 [2024-07-12 18:21:59.368630] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1804350
00:18:15.807 [2024-07-12 18:21:59.368701] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:18:15.807 BaseBdev4
00:18:15.807 18:21:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4
00:18:15.807 18:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4
00:18:15.807 18:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:18:15.807 18:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i
00:18:15.807 18:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:18:15.807 18:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:18:15.807 18:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:18:16.066 18:21:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
00:18:16.633 [
00:18:16.633 {
00:18:16.633 "name": "BaseBdev4",
00:18:16.633 "aliases": [
00:18:16.633 "ff3c1b6d-c327-4a52-b092-b4db5f5c317c"
00:18:16.633 ],
00:18:16.633 "product_name": "Malloc disk",
00:18:16.633 "block_size": 512,
00:18:16.633 "num_blocks": 65536,
00:18:16.634 "uuid": "ff3c1b6d-c327-4a52-b092-b4db5f5c317c",
00:18:16.634 "assigned_rate_limits": {
00:18:16.634 "rw_ios_per_sec": 0,
00:18:16.634 "rw_mbytes_per_sec": 0,
00:18:16.634 "r_mbytes_per_sec": 0,
00:18:16.634 "w_mbytes_per_sec": 0
00:18:16.634 },
00:18:16.634 "claimed": true,
00:18:16.634 "claim_type": "exclusive_write",
00:18:16.634 "zoned": false,
00:18:16.634 "supported_io_types": {
00:18:16.634 "read": true,
00:18:16.634 "write": true,
00:18:16.634 "unmap": true,
00:18:16.634 "flush": true,
00:18:16.634 "reset": true,
00:18:16.634 "nvme_admin": false,
00:18:16.634 "nvme_io": false,
00:18:16.634 "nvme_io_md": false,
00:18:16.634 "write_zeroes": true,
00:18:16.634 "zcopy": true,
00:18:16.634 "get_zone_info": false,
00:18:16.634 "zone_management": false,
00:18:16.634 "zone_append": false,
00:18:16.634 "compare": false,
00:18:16.634 "compare_and_write": false,
00:18:16.634 "abort": true,
00:18:16.634 "seek_hole": false,
00:18:16.634 "seek_data": false,
00:18:16.634 "copy": true,
00:18:16.634 "nvme_iov_md": false
00:18:16.634 },
00:18:16.634 "memory_domains": [
00:18:16.634 {
00:18:16.634 "dma_device_id": "system",
00:18:16.634 "dma_device_type": 1
00:18:16.634 },
00:18:16.634 {
00:18:16.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:16.634 "dma_device_type": 2
00:18:16.634 }
00:18:16.634 ],
00:18:16.634 "driver_specific": {}
00:18:16.634 }
00:18:16.634 ]
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:18:16.634 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:18:16.893 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:18:16.893 "name": "Existed_Raid",
00:18:16.893 "uuid": "ea2c2de6-2f00-43fc-aabf-8026ff1f7132",
00:18:16.893 "strip_size_kb": 64,
00:18:16.893 "state": "online",
00:18:16.893 "raid_level": "raid0",
00:18:16.893 "superblock": true,
00:18:16.893 "num_base_bdevs": 4,
00:18:16.893 "num_base_bdevs_discovered": 4,
00:18:16.893 "num_base_bdevs_operational": 4,
00:18:16.893 "base_bdevs_list": [
00:18:16.893 {
00:18:16.893 "name": "BaseBdev1",
00:18:16.893 "uuid": "da6da937-5530-440d-b547-65da36001b6f",
00:18:16.893 "is_configured": true,
00:18:16.893 "data_offset": 2048,
00:18:16.893 "data_size": 63488
00:18:16.893 },
00:18:16.893 {
00:18:16.893 "name": "BaseBdev2",
00:18:16.893 "uuid": "da9809c6-2535-4b4e-89fc-4f2d6d784040",
00:18:16.893 "is_configured": true,
00:18:16.893 "data_offset": 2048,
00:18:16.893 "data_size": 63488
00:18:16.893 },
00:18:16.893 {
00:18:16.893 "name": "BaseBdev3",
00:18:16.893 "uuid": "9eac696e-d167-4f76-ad9f-67e56b587cd3",
00:18:16.893 "is_configured": true,
00:18:16.893 "data_offset": 2048,
00:18:16.893 "data_size": 63488
00:18:16.893 },
00:18:16.893 {
00:18:16.893 "name": "BaseBdev4",
00:18:16.893 "uuid": "ff3c1b6d-c327-4a52-b092-b4db5f5c317c",
00:18:16.893 "is_configured": true,
00:18:16.893 "data_offset": 2048,
00:18:16.893 "data_size": 63488
00:18:16.893 }
00:18:16.893 ]
00:18:16.893 }'
00:18:16.893 18:22:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:18:16.893 18:22:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:18:17.460 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid
00:18:17.460 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid
00:18:17.460 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:18:17.460 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:18:17.460 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:18:17.460 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name
00:18:17.460 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:18:17.460 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:18:17.718 [2024-07-12 18:22:01.241231] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:18:17.718 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:18:17.718 "name": "Existed_Raid",
00:18:17.718 "aliases": [
00:18:17.718 "ea2c2de6-2f00-43fc-aabf-8026ff1f7132"
00:18:17.718 ],
00:18:17.718 "product_name": "Raid Volume",
00:18:17.718 "block_size": 512,
00:18:17.718 "num_blocks": 253952,
00:18:17.718 "uuid": "ea2c2de6-2f00-43fc-aabf-8026ff1f7132",
00:18:17.718 "assigned_rate_limits": {
00:18:17.718 "rw_ios_per_sec": 0,
00:18:17.718 "rw_mbytes_per_sec": 0,
00:18:17.718 "r_mbytes_per_sec": 0,
00:18:17.718 "w_mbytes_per_sec": 0
00:18:17.718 },
00:18:17.718 "claimed": false,
00:18:17.718 "zoned": false,
00:18:17.718 "supported_io_types": {
00:18:17.718 "read": true,
00:18:17.718 "write": true,
00:18:17.718 "unmap": true,
00:18:17.718 "flush": true,
00:18:17.718 "reset": true,
00:18:17.718 "nvme_admin": false,
00:18:17.718 "nvme_io": false,
00:18:17.718 "nvme_io_md": false,
00:18:17.718 "write_zeroes": true,
00:18:17.718 "zcopy": false,
00:18:17.718 "get_zone_info": false,
00:18:17.718 "zone_management": false,
00:18:17.718 "zone_append": false,
00:18:17.718 "compare": false,
00:18:17.718 "compare_and_write": false,
00:18:17.718 "abort": false,
00:18:17.718 "seek_hole": false,
00:18:17.718 "seek_data": false,
00:18:17.718 "copy": false,
00:18:17.718 "nvme_iov_md": false
00:18:17.718 },
00:18:17.718 "memory_domains": [
00:18:17.718 {
00:18:17.718 "dma_device_id": "system",
00:18:17.718 "dma_device_type": 1
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:17.718 "dma_device_type": 2
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "dma_device_id": "system",
00:18:17.718 "dma_device_type": 1
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:17.718 "dma_device_type": 2
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "dma_device_id": "system",
00:18:17.718 "dma_device_type": 1
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:17.718 "dma_device_type": 2
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "dma_device_id": "system",
00:18:17.718 "dma_device_type": 1
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:17.718 "dma_device_type": 2
00:18:17.718 }
00:18:17.718 ],
00:18:17.718 "driver_specific": {
00:18:17.718 "raid": {
00:18:17.718 "uuid": "ea2c2de6-2f00-43fc-aabf-8026ff1f7132",
00:18:17.718 "strip_size_kb": 64,
00:18:17.718 "state": "online",
00:18:17.718 "raid_level": "raid0",
00:18:17.718 "superblock": true,
00:18:17.718 "num_base_bdevs": 4,
00:18:17.718 "num_base_bdevs_discovered": 4,
00:18:17.718 "num_base_bdevs_operational": 4,
00:18:17.718 "base_bdevs_list": [
00:18:17.718 {
00:18:17.718 "name": "BaseBdev1",
00:18:17.718 "uuid": "da6da937-5530-440d-b547-65da36001b6f",
00:18:17.718 "is_configured": true,
00:18:17.718 "data_offset": 2048,
00:18:17.718 "data_size": 63488
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "name": "BaseBdev2",
00:18:17.718 "uuid": "da9809c6-2535-4b4e-89fc-4f2d6d784040",
00:18:17.718 "is_configured": true,
00:18:17.718 "data_offset": 2048,
00:18:17.718 "data_size": 63488
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "name": "BaseBdev3",
00:18:17.718 "uuid": "9eac696e-d167-4f76-ad9f-67e56b587cd3",
00:18:17.718 "is_configured": true,
00:18:17.718 "data_offset": 2048,
00:18:17.718 "data_size": 63488
00:18:17.718 },
00:18:17.718 {
00:18:17.718 "name": "BaseBdev4",
00:18:17.718 "uuid": "ff3c1b6d-c327-4a52-b092-b4db5f5c317c",
00:18:17.718 "is_configured": true,
00:18:17.718 "data_offset": 2048,
00:18:17.718 "data_size": 63488
00:18:17.718 }
00:18:17.718 ]
00:18:17.718 }
00:18:17.718 }
00:18:17.718 }'
00:18:17.718 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:18:17.718 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1
00:18:17.718 BaseBdev2
00:18:17.718 BaseBdev3
00:18:17.718 BaseBdev4'
00:18:17.718 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:18:17.718 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1
00:18:17.718 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:18:17.977 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:18:17.977 "name": "BaseBdev1",
00:18:17.977 "aliases": [
00:18:17.977 "da6da937-5530-440d-b547-65da36001b6f"
00:18:17.977 ],
00:18:17.977 "product_name": "Malloc disk",
00:18:17.977 "block_size": 512,
00:18:17.977 "num_blocks": 65536,
00:18:17.977 "uuid": "da6da937-5530-440d-b547-65da36001b6f",
00:18:17.977 "assigned_rate_limits": {
00:18:17.977 "rw_ios_per_sec": 0,
00:18:17.977 "rw_mbytes_per_sec": 0,
00:18:17.977 "r_mbytes_per_sec": 0,
00:18:17.977 "w_mbytes_per_sec": 0
00:18:17.977 },
00:18:17.977 "claimed": true,
00:18:17.977 "claim_type": "exclusive_write",
00:18:17.977 "zoned": false,
00:18:17.977 "supported_io_types": {
00:18:17.977 "read": true,
00:18:17.977 "write": true,
00:18:17.977 "unmap": true,
00:18:17.977 "flush": true,
00:18:17.977 "reset": true,
00:18:17.977 "nvme_admin": false,
00:18:17.977 "nvme_io": false,
00:18:17.977 "nvme_io_md": false,
00:18:17.977 "write_zeroes": true,
00:18:17.977 "zcopy": true,
00:18:17.977 "get_zone_info": false,
00:18:17.977 "zone_management": false,
00:18:17.977 "zone_append": false,
00:18:17.977 "compare": false,
00:18:17.977 "compare_and_write": false,
00:18:17.977 "abort": true,
00:18:17.977 "seek_hole": false,
00:18:17.977 "seek_data": false,
00:18:17.977 "copy": true,
00:18:17.977 "nvme_iov_md": false
00:18:17.977 },
00:18:17.977 "memory_domains": [
00:18:17.977 {
00:18:17.977 "dma_device_id": "system",
00:18:17.977 "dma_device_type": 1
00:18:17.977 },
00:18:17.977 {
00:18:17.977 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:17.977 "dma_device_type": 2
00:18:17.977 }
00:18:17.977 ],
00:18:17.977 "driver_specific": {}
00:18:17.977 }'
00:18:17.977 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:18:17.977 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:18:17.977 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:18:17.977 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:18:17.977 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:18:17.977 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:18:17.977 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:18:18.236 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:18:18.236 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:18:18.236 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:18:18.236 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:18:18.236 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:18:18.236 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:18:18.236 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2
00:18:18.236 18:22:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:18:18.495 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:18:18.495 "name": "BaseBdev2",
00:18:18.495 "aliases": [
00:18:18.495 "da9809c6-2535-4b4e-89fc-4f2d6d784040"
00:18:18.495 ],
00:18:18.495 "product_name": "Malloc disk",
00:18:18.495 "block_size": 512,
00:18:18.495 "num_blocks": 65536,
00:18:18.495 "uuid": "da9809c6-2535-4b4e-89fc-4f2d6d784040",
00:18:18.495 "assigned_rate_limits": {
00:18:18.495 "rw_ios_per_sec": 0,
00:18:18.495 "rw_mbytes_per_sec": 0,
00:18:18.495 "r_mbytes_per_sec": 0,
00:18:18.495 "w_mbytes_per_sec": 0
00:18:18.495 },
00:18:18.495 "claimed": true,
00:18:18.495 "claim_type": "exclusive_write",
00:18:18.495 "zoned": false,
00:18:18.495 "supported_io_types": {
00:18:18.495 "read": true,
00:18:18.495 "write": true,
00:18:18.495 "unmap": true,
00:18:18.495 "flush": true,
00:18:18.495 "reset": true,
00:18:18.495 "nvme_admin": false,
00:18:18.495 "nvme_io": false,
00:18:18.495 "nvme_io_md": false,
00:18:18.495 "write_zeroes": true,
00:18:18.495 "zcopy": true,
00:18:18.495 "get_zone_info": false,
00:18:18.495 "zone_management": false,
00:18:18.495 "zone_append": false,
00:18:18.495 "compare": false,
00:18:18.495 "compare_and_write": false,
00:18:18.495 "abort": true,
00:18:18.495 "seek_hole": false,
00:18:18.495 "seek_data": false,
00:18:18.495 "copy": true,
00:18:18.495 "nvme_iov_md": false
00:18:18.495 },
00:18:18.495 "memory_domains": [
00:18:18.495 {
00:18:18.495 "dma_device_id": "system",
00:18:18.495 "dma_device_type": 1
00:18:18.495 },
00:18:18.495 {
00:18:18.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:18:18.495 "dma_device_type": 2
00:18:18.495 }
00:18:18.495 ],
00:18:18.495 "driver_specific": {}
00:18:18.495 }'
00:18:18.495 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:18:18.495 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:18:18.495 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:18:18.495 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3
00:18:18.754 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:18:19.012 18:22:02
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:19.012 "name": "BaseBdev3", 00:18:19.012 "aliases": [ 00:18:19.012 "9eac696e-d167-4f76-ad9f-67e56b587cd3" 00:18:19.012 ], 00:18:19.012 "product_name": "Malloc disk", 00:18:19.012 "block_size": 512, 00:18:19.012 "num_blocks": 65536, 00:18:19.012 "uuid": "9eac696e-d167-4f76-ad9f-67e56b587cd3", 00:18:19.012 "assigned_rate_limits": { 00:18:19.012 "rw_ios_per_sec": 0, 00:18:19.012 "rw_mbytes_per_sec": 0, 00:18:19.012 "r_mbytes_per_sec": 0, 00:18:19.012 "w_mbytes_per_sec": 0 00:18:19.012 }, 00:18:19.012 "claimed": true, 00:18:19.012 "claim_type": "exclusive_write", 00:18:19.012 "zoned": false, 00:18:19.012 "supported_io_types": { 00:18:19.012 "read": true, 00:18:19.012 "write": true, 00:18:19.012 "unmap": true, 00:18:19.012 "flush": true, 00:18:19.012 "reset": true, 00:18:19.012 "nvme_admin": false, 00:18:19.012 "nvme_io": false, 00:18:19.012 "nvme_io_md": false, 00:18:19.012 "write_zeroes": true, 00:18:19.012 "zcopy": true, 00:18:19.012 "get_zone_info": false, 00:18:19.012 "zone_management": false, 00:18:19.012 "zone_append": false, 00:18:19.012 "compare": false, 00:18:19.012 "compare_and_write": false, 00:18:19.012 "abort": true, 00:18:19.012 "seek_hole": false, 00:18:19.012 "seek_data": false, 00:18:19.012 "copy": true, 00:18:19.012 "nvme_iov_md": false 00:18:19.012 }, 00:18:19.012 "memory_domains": [ 00:18:19.012 { 00:18:19.012 "dma_device_id": "system", 00:18:19.012 "dma_device_type": 1 00:18:19.012 }, 00:18:19.012 { 00:18:19.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.012 "dma_device_type": 2 00:18:19.012 } 00:18:19.012 ], 00:18:19.012 "driver_specific": {} 00:18:19.012 }' 00:18:19.012 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.271 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.271 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:18:19.271 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.271 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.271 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:19.271 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.271 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:19.271 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:19.271 18:22:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.529 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:19.529 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:19.529 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:19.529 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:19.529 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:19.787 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:19.787 "name": "BaseBdev4", 00:18:19.787 "aliases": [ 00:18:19.787 "ff3c1b6d-c327-4a52-b092-b4db5f5c317c" 00:18:19.787 ], 00:18:19.787 "product_name": "Malloc disk", 00:18:19.787 "block_size": 512, 00:18:19.787 "num_blocks": 65536, 00:18:19.787 "uuid": "ff3c1b6d-c327-4a52-b092-b4db5f5c317c", 00:18:19.787 "assigned_rate_limits": { 00:18:19.787 "rw_ios_per_sec": 0, 00:18:19.787 "rw_mbytes_per_sec": 0, 00:18:19.787 "r_mbytes_per_sec": 0, 00:18:19.787 "w_mbytes_per_sec": 0 
00:18:19.787 }, 00:18:19.787 "claimed": true, 00:18:19.787 "claim_type": "exclusive_write", 00:18:19.787 "zoned": false, 00:18:19.787 "supported_io_types": { 00:18:19.787 "read": true, 00:18:19.787 "write": true, 00:18:19.787 "unmap": true, 00:18:19.787 "flush": true, 00:18:19.787 "reset": true, 00:18:19.787 "nvme_admin": false, 00:18:19.787 "nvme_io": false, 00:18:19.787 "nvme_io_md": false, 00:18:19.787 "write_zeroes": true, 00:18:19.787 "zcopy": true, 00:18:19.787 "get_zone_info": false, 00:18:19.787 "zone_management": false, 00:18:19.787 "zone_append": false, 00:18:19.787 "compare": false, 00:18:19.787 "compare_and_write": false, 00:18:19.787 "abort": true, 00:18:19.787 "seek_hole": false, 00:18:19.787 "seek_data": false, 00:18:19.787 "copy": true, 00:18:19.787 "nvme_iov_md": false 00:18:19.787 }, 00:18:19.787 "memory_domains": [ 00:18:19.787 { 00:18:19.787 "dma_device_id": "system", 00:18:19.787 "dma_device_type": 1 00:18:19.787 }, 00:18:19.787 { 00:18:19.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.787 "dma_device_type": 2 00:18:19.787 } 00:18:19.787 ], 00:18:19.787 "driver_specific": {} 00:18:19.787 }' 00:18:19.787 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.787 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:19.787 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:19.787 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.787 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:19.787 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:19.787 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.045 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.045 
18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:20.045 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.045 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.045 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:20.045 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:20.304 [2024-07-12 18:22:03.875858] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:20.304 [2024-07-12 18:22:03.875884] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:20.304 [2024-07-12 18:22:03.875930] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.304 18:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:20.563 18:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.563 "name": "Existed_Raid", 00:18:20.563 "uuid": "ea2c2de6-2f00-43fc-aabf-8026ff1f7132", 00:18:20.563 "strip_size_kb": 64, 00:18:20.563 "state": "offline", 00:18:20.563 "raid_level": "raid0", 00:18:20.563 "superblock": true, 00:18:20.563 "num_base_bdevs": 4, 00:18:20.563 "num_base_bdevs_discovered": 3, 00:18:20.563 "num_base_bdevs_operational": 3, 00:18:20.563 "base_bdevs_list": [ 00:18:20.563 { 00:18:20.563 "name": null, 00:18:20.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.563 "is_configured": false, 00:18:20.563 "data_offset": 2048, 00:18:20.563 "data_size": 63488 00:18:20.563 }, 00:18:20.563 { 00:18:20.563 "name": "BaseBdev2", 00:18:20.564 "uuid": "da9809c6-2535-4b4e-89fc-4f2d6d784040", 00:18:20.564 "is_configured": true, 00:18:20.564 "data_offset": 2048, 00:18:20.564 "data_size": 63488 00:18:20.564 }, 00:18:20.564 
{ 00:18:20.564 "name": "BaseBdev3", 00:18:20.564 "uuid": "9eac696e-d167-4f76-ad9f-67e56b587cd3", 00:18:20.564 "is_configured": true, 00:18:20.564 "data_offset": 2048, 00:18:20.564 "data_size": 63488 00:18:20.564 }, 00:18:20.564 { 00:18:20.564 "name": "BaseBdev4", 00:18:20.564 "uuid": "ff3c1b6d-c327-4a52-b092-b4db5f5c317c", 00:18:20.564 "is_configured": true, 00:18:20.564 "data_offset": 2048, 00:18:20.564 "data_size": 63488 00:18:20.564 } 00:18:20.564 ] 00:18:20.564 }' 00:18:20.564 18:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.564 18:22:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:21.131 18:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:21.131 18:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:21.131 18:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.131 18:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:21.389 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:21.389 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:21.389 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:21.648 [2024-07-12 18:22:05.231314] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:21.648 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:21.648 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:21.648 
18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.648 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:21.906 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:21.906 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:21.906 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:22.165 [2024-07-12 18:22:05.745229] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:22.165 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:22.165 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:22.165 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.165 18:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:22.424 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:22.424 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:22.424 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:22.683 [2024-07-12 18:22:06.259341] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:22.683 [2024-07-12 18:22:06.259385] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1804350 name Existed_Raid, state offline 00:18:22.683 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:22.683 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:22.683 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.683 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:22.942 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:22.942 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:22.942 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:22.942 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:22.942 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:22.942 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:23.200 BaseBdev2 00:18:23.200 18:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:23.200 18:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:23.200 18:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:23.200 18:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:23.200 18:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
00:18:23.200 18:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:23.200 18:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:23.459 18:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:23.718 [ 00:18:23.718 { 00:18:23.718 "name": "BaseBdev2", 00:18:23.718 "aliases": [ 00:18:23.718 "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a" 00:18:23.718 ], 00:18:23.718 "product_name": "Malloc disk", 00:18:23.718 "block_size": 512, 00:18:23.718 "num_blocks": 65536, 00:18:23.718 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:23.718 "assigned_rate_limits": { 00:18:23.718 "rw_ios_per_sec": 0, 00:18:23.718 "rw_mbytes_per_sec": 0, 00:18:23.718 "r_mbytes_per_sec": 0, 00:18:23.718 "w_mbytes_per_sec": 0 00:18:23.718 }, 00:18:23.718 "claimed": false, 00:18:23.718 "zoned": false, 00:18:23.718 "supported_io_types": { 00:18:23.718 "read": true, 00:18:23.718 "write": true, 00:18:23.718 "unmap": true, 00:18:23.718 "flush": true, 00:18:23.718 "reset": true, 00:18:23.718 "nvme_admin": false, 00:18:23.718 "nvme_io": false, 00:18:23.719 "nvme_io_md": false, 00:18:23.719 "write_zeroes": true, 00:18:23.719 "zcopy": true, 00:18:23.719 "get_zone_info": false, 00:18:23.719 "zone_management": false, 00:18:23.719 "zone_append": false, 00:18:23.719 "compare": false, 00:18:23.719 "compare_and_write": false, 00:18:23.719 "abort": true, 00:18:23.719 "seek_hole": false, 00:18:23.719 "seek_data": false, 00:18:23.719 "copy": true, 00:18:23.719 "nvme_iov_md": false 00:18:23.719 }, 00:18:23.719 "memory_domains": [ 00:18:23.719 { 00:18:23.719 "dma_device_id": "system", 00:18:23.719 "dma_device_type": 1 00:18:23.719 }, 00:18:23.719 { 00:18:23.719 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.719 "dma_device_type": 2 00:18:23.719 } 00:18:23.719 ], 00:18:23.719 "driver_specific": {} 00:18:23.719 } 00:18:23.719 ] 00:18:23.719 18:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:23.719 18:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:23.719 18:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:23.719 18:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:23.977 BaseBdev3 00:18:23.977 18:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:23.977 18:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:23.977 18:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:23.977 18:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:23.977 18:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:23.977 18:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:23.977 18:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:24.236 18:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:24.495 [ 00:18:24.495 { 00:18:24.495 "name": "BaseBdev3", 00:18:24.495 "aliases": [ 00:18:24.495 "02d28d6b-d39c-450c-967b-2af333a551bd" 
00:18:24.495 ], 00:18:24.495 "product_name": "Malloc disk", 00:18:24.495 "block_size": 512, 00:18:24.495 "num_blocks": 65536, 00:18:24.495 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:24.495 "assigned_rate_limits": { 00:18:24.495 "rw_ios_per_sec": 0, 00:18:24.495 "rw_mbytes_per_sec": 0, 00:18:24.495 "r_mbytes_per_sec": 0, 00:18:24.495 "w_mbytes_per_sec": 0 00:18:24.495 }, 00:18:24.495 "claimed": false, 00:18:24.495 "zoned": false, 00:18:24.495 "supported_io_types": { 00:18:24.495 "read": true, 00:18:24.495 "write": true, 00:18:24.495 "unmap": true, 00:18:24.495 "flush": true, 00:18:24.495 "reset": true, 00:18:24.495 "nvme_admin": false, 00:18:24.495 "nvme_io": false, 00:18:24.495 "nvme_io_md": false, 00:18:24.495 "write_zeroes": true, 00:18:24.495 "zcopy": true, 00:18:24.495 "get_zone_info": false, 00:18:24.495 "zone_management": false, 00:18:24.495 "zone_append": false, 00:18:24.495 "compare": false, 00:18:24.495 "compare_and_write": false, 00:18:24.495 "abort": true, 00:18:24.495 "seek_hole": false, 00:18:24.495 "seek_data": false, 00:18:24.495 "copy": true, 00:18:24.495 "nvme_iov_md": false 00:18:24.495 }, 00:18:24.495 "memory_domains": [ 00:18:24.495 { 00:18:24.495 "dma_device_id": "system", 00:18:24.495 "dma_device_type": 1 00:18:24.495 }, 00:18:24.495 { 00:18:24.495 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.495 "dma_device_type": 2 00:18:24.495 } 00:18:24.495 ], 00:18:24.495 "driver_specific": {} 00:18:24.495 } 00:18:24.495 ] 00:18:24.495 18:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:24.495 18:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:24.495 18:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:24.495 18:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4 00:18:24.756 BaseBdev4 00:18:24.756 18:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:24.756 18:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:24.756 18:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:24.756 18:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:24.756 18:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:24.756 18:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:24.756 18:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:25.049 18:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:25.309 [ 00:18:25.309 { 00:18:25.309 "name": "BaseBdev4", 00:18:25.309 "aliases": [ 00:18:25.309 "65e8b807-c7cd-4272-8ae0-9460b362bac7" 00:18:25.309 ], 00:18:25.309 "product_name": "Malloc disk", 00:18:25.309 "block_size": 512, 00:18:25.309 "num_blocks": 65536, 00:18:25.309 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:25.309 "assigned_rate_limits": { 00:18:25.309 "rw_ios_per_sec": 0, 00:18:25.309 "rw_mbytes_per_sec": 0, 00:18:25.309 "r_mbytes_per_sec": 0, 00:18:25.309 "w_mbytes_per_sec": 0 00:18:25.309 }, 00:18:25.309 "claimed": false, 00:18:25.309 "zoned": false, 00:18:25.309 "supported_io_types": { 00:18:25.309 "read": true, 00:18:25.309 "write": true, 00:18:25.309 "unmap": true, 00:18:25.309 "flush": true, 00:18:25.309 "reset": true, 00:18:25.309 "nvme_admin": false, 00:18:25.309 "nvme_io": false, 00:18:25.309 
"nvme_io_md": false, 00:18:25.309 "write_zeroes": true, 00:18:25.309 "zcopy": true, 00:18:25.309 "get_zone_info": false, 00:18:25.309 "zone_management": false, 00:18:25.309 "zone_append": false, 00:18:25.309 "compare": false, 00:18:25.309 "compare_and_write": false, 00:18:25.309 "abort": true, 00:18:25.309 "seek_hole": false, 00:18:25.309 "seek_data": false, 00:18:25.309 "copy": true, 00:18:25.309 "nvme_iov_md": false 00:18:25.309 }, 00:18:25.309 "memory_domains": [ 00:18:25.309 { 00:18:25.309 "dma_device_id": "system", 00:18:25.309 "dma_device_type": 1 00:18:25.309 }, 00:18:25.309 { 00:18:25.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.309 "dma_device_type": 2 00:18:25.309 } 00:18:25.309 ], 00:18:25.309 "driver_specific": {} 00:18:25.309 } 00:18:25.309 ] 00:18:25.309 18:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:25.309 18:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:25.309 18:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:25.309 18:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:25.309 [2024-07-12 18:22:09.001842] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:25.309 [2024-07-12 18:22:09.001883] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:25.309 [2024-07-12 18:22:09.001899] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:25.309 [2024-07-12 18:22:09.002950] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:25.309 [2024-07-12 18:22:09.002981] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:18:25.309 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:25.309 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:25.309 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.309 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:25.309 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:25.309 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:25.309 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.309 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.310 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.310 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.310 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.310 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.568 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.568 "name": "Existed_Raid", 00:18:25.568 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:25.568 "strip_size_kb": 64, 00:18:25.568 "state": "configuring", 00:18:25.568 "raid_level": "raid0", 00:18:25.568 "superblock": true, 00:18:25.568 "num_base_bdevs": 4, 00:18:25.568 "num_base_bdevs_discovered": 3, 00:18:25.568 
"num_base_bdevs_operational": 4, 00:18:25.568 "base_bdevs_list": [ 00:18:25.568 { 00:18:25.568 "name": "BaseBdev1", 00:18:25.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.568 "is_configured": false, 00:18:25.568 "data_offset": 0, 00:18:25.568 "data_size": 0 00:18:25.568 }, 00:18:25.568 { 00:18:25.568 "name": "BaseBdev2", 00:18:25.568 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:25.568 "is_configured": true, 00:18:25.568 "data_offset": 2048, 00:18:25.568 "data_size": 63488 00:18:25.568 }, 00:18:25.568 { 00:18:25.568 "name": "BaseBdev3", 00:18:25.568 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:25.568 "is_configured": true, 00:18:25.568 "data_offset": 2048, 00:18:25.568 "data_size": 63488 00:18:25.568 }, 00:18:25.568 { 00:18:25.568 "name": "BaseBdev4", 00:18:25.568 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:25.568 "is_configured": true, 00:18:25.568 "data_offset": 2048, 00:18:25.568 "data_size": 63488 00:18:25.568 } 00:18:25.568 ] 00:18:25.568 }' 00:18:25.568 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.568 18:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:26.137 18:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:26.396 [2024-07-12 18:22:10.080623] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.396 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.655 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.655 "name": "Existed_Raid", 00:18:26.655 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:26.655 "strip_size_kb": 64, 00:18:26.655 "state": "configuring", 00:18:26.655 "raid_level": "raid0", 00:18:26.655 "superblock": true, 00:18:26.655 "num_base_bdevs": 4, 00:18:26.655 "num_base_bdevs_discovered": 2, 00:18:26.655 "num_base_bdevs_operational": 4, 00:18:26.655 "base_bdevs_list": [ 00:18:26.655 { 00:18:26.655 "name": "BaseBdev1", 00:18:26.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.655 "is_configured": false, 00:18:26.655 "data_offset": 0, 00:18:26.655 "data_size": 0 00:18:26.655 }, 00:18:26.655 { 00:18:26.655 "name": null, 00:18:26.655 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:26.655 "is_configured": false, 00:18:26.655 "data_offset": 2048, 00:18:26.655 "data_size": 
63488 00:18:26.655 }, 00:18:26.655 { 00:18:26.655 "name": "BaseBdev3", 00:18:26.655 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:26.655 "is_configured": true, 00:18:26.655 "data_offset": 2048, 00:18:26.655 "data_size": 63488 00:18:26.655 }, 00:18:26.655 { 00:18:26.655 "name": "BaseBdev4", 00:18:26.655 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:26.655 "is_configured": true, 00:18:26.655 "data_offset": 2048, 00:18:26.655 "data_size": 63488 00:18:26.655 } 00:18:26.655 ] 00:18:26.655 }' 00:18:26.655 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.655 18:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.223 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:27.223 18:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.482 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:27.482 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:27.741 [2024-07-12 18:22:11.424287] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:27.741 BaseBdev1 00:18:27.741 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:27.741 18:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:27.741 18:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:27.741 18:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 
00:18:27.741 18:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:27.741 18:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:27.741 18:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:28.008 18:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:28.270 [ 00:18:28.271 { 00:18:28.271 "name": "BaseBdev1", 00:18:28.271 "aliases": [ 00:18:28.271 "f8e95bb2-8196-4448-964a-2fec484a337e" 00:18:28.271 ], 00:18:28.271 "product_name": "Malloc disk", 00:18:28.271 "block_size": 512, 00:18:28.271 "num_blocks": 65536, 00:18:28.271 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:28.271 "assigned_rate_limits": { 00:18:28.271 "rw_ios_per_sec": 0, 00:18:28.271 "rw_mbytes_per_sec": 0, 00:18:28.271 "r_mbytes_per_sec": 0, 00:18:28.271 "w_mbytes_per_sec": 0 00:18:28.271 }, 00:18:28.271 "claimed": true, 00:18:28.271 "claim_type": "exclusive_write", 00:18:28.271 "zoned": false, 00:18:28.271 "supported_io_types": { 00:18:28.271 "read": true, 00:18:28.271 "write": true, 00:18:28.271 "unmap": true, 00:18:28.271 "flush": true, 00:18:28.271 "reset": true, 00:18:28.271 "nvme_admin": false, 00:18:28.271 "nvme_io": false, 00:18:28.271 "nvme_io_md": false, 00:18:28.271 "write_zeroes": true, 00:18:28.271 "zcopy": true, 00:18:28.271 "get_zone_info": false, 00:18:28.271 "zone_management": false, 00:18:28.271 "zone_append": false, 00:18:28.271 "compare": false, 00:18:28.271 "compare_and_write": false, 00:18:28.271 "abort": true, 00:18:28.271 "seek_hole": false, 00:18:28.271 "seek_data": false, 00:18:28.271 "copy": true, 00:18:28.271 "nvme_iov_md": false 00:18:28.271 }, 00:18:28.271 
"memory_domains": [ 00:18:28.271 { 00:18:28.271 "dma_device_id": "system", 00:18:28.271 "dma_device_type": 1 00:18:28.271 }, 00:18:28.271 { 00:18:28.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.271 "dma_device_type": 2 00:18:28.271 } 00:18:28.271 ], 00:18:28.271 "driver_specific": {} 00:18:28.271 } 00:18:28.271 ] 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.271 18:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:28.530 18:22:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.530 "name": "Existed_Raid", 00:18:28.530 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:28.530 "strip_size_kb": 64, 00:18:28.530 "state": "configuring", 00:18:28.530 "raid_level": "raid0", 00:18:28.530 "superblock": true, 00:18:28.530 "num_base_bdevs": 4, 00:18:28.530 "num_base_bdevs_discovered": 3, 00:18:28.530 "num_base_bdevs_operational": 4, 00:18:28.530 "base_bdevs_list": [ 00:18:28.530 { 00:18:28.530 "name": "BaseBdev1", 00:18:28.530 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:28.530 "is_configured": true, 00:18:28.530 "data_offset": 2048, 00:18:28.530 "data_size": 63488 00:18:28.530 }, 00:18:28.530 { 00:18:28.530 "name": null, 00:18:28.530 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:28.530 "is_configured": false, 00:18:28.530 "data_offset": 2048, 00:18:28.530 "data_size": 63488 00:18:28.530 }, 00:18:28.530 { 00:18:28.530 "name": "BaseBdev3", 00:18:28.530 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:28.530 "is_configured": true, 00:18:28.530 "data_offset": 2048, 00:18:28.530 "data_size": 63488 00:18:28.530 }, 00:18:28.530 { 00:18:28.530 "name": "BaseBdev4", 00:18:28.530 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:28.530 "is_configured": true, 00:18:28.530 "data_offset": 2048, 00:18:28.530 "data_size": 63488 00:18:28.530 } 00:18:28.530 ] 00:18:28.530 }' 00:18:28.530 18:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.530 18:22:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:29.099 18:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.099 18:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:29.357 18:22:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:29.357 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:29.616 [2024-07-12 18:22:13.240991] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.616 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.875 18:22:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.875 "name": "Existed_Raid", 00:18:29.875 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:29.875 "strip_size_kb": 64, 00:18:29.875 "state": "configuring", 00:18:29.875 "raid_level": "raid0", 00:18:29.875 "superblock": true, 00:18:29.875 "num_base_bdevs": 4, 00:18:29.875 "num_base_bdevs_discovered": 2, 00:18:29.875 "num_base_bdevs_operational": 4, 00:18:29.875 "base_bdevs_list": [ 00:18:29.875 { 00:18:29.875 "name": "BaseBdev1", 00:18:29.875 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:29.875 "is_configured": true, 00:18:29.875 "data_offset": 2048, 00:18:29.875 "data_size": 63488 00:18:29.875 }, 00:18:29.875 { 00:18:29.875 "name": null, 00:18:29.875 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:29.875 "is_configured": false, 00:18:29.875 "data_offset": 2048, 00:18:29.875 "data_size": 63488 00:18:29.875 }, 00:18:29.875 { 00:18:29.875 "name": null, 00:18:29.875 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:29.875 "is_configured": false, 00:18:29.875 "data_offset": 2048, 00:18:29.875 "data_size": 63488 00:18:29.875 }, 00:18:29.875 { 00:18:29.875 "name": "BaseBdev4", 00:18:29.875 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:29.875 "is_configured": true, 00:18:29.875 "data_offset": 2048, 00:18:29.875 "data_size": 63488 00:18:29.875 } 00:18:29.875 ] 00:18:29.875 }' 00:18:29.875 18:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.875 18:22:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:30.444 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.444 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:30.702 18:22:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:30.702 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:30.961 [2024-07-12 18:22:14.576459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.961 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:31.220 18:22:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.220 "name": "Existed_Raid", 00:18:31.220 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:31.220 "strip_size_kb": 64, 00:18:31.220 "state": "configuring", 00:18:31.220 "raid_level": "raid0", 00:18:31.220 "superblock": true, 00:18:31.220 "num_base_bdevs": 4, 00:18:31.220 "num_base_bdevs_discovered": 3, 00:18:31.220 "num_base_bdevs_operational": 4, 00:18:31.220 "base_bdevs_list": [ 00:18:31.220 { 00:18:31.220 "name": "BaseBdev1", 00:18:31.220 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:31.220 "is_configured": true, 00:18:31.220 "data_offset": 2048, 00:18:31.220 "data_size": 63488 00:18:31.220 }, 00:18:31.220 { 00:18:31.220 "name": null, 00:18:31.220 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:31.220 "is_configured": false, 00:18:31.220 "data_offset": 2048, 00:18:31.220 "data_size": 63488 00:18:31.220 }, 00:18:31.220 { 00:18:31.220 "name": "BaseBdev3", 00:18:31.220 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:31.220 "is_configured": true, 00:18:31.220 "data_offset": 2048, 00:18:31.220 "data_size": 63488 00:18:31.220 }, 00:18:31.220 { 00:18:31.220 "name": "BaseBdev4", 00:18:31.220 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:31.220 "is_configured": true, 00:18:31.220 "data_offset": 2048, 00:18:31.220 "data_size": 63488 00:18:31.220 } 00:18:31.220 ] 00:18:31.220 }' 00:18:31.220 18:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.220 18:22:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:31.787 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.787 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:32.046 18:22:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:32.046 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:32.306 [2024-07-12 18:22:15.915916] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:32.306 18:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.565 18:22:16 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:32.565 "name": "Existed_Raid", 00:18:32.565 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:32.565 "strip_size_kb": 64, 00:18:32.565 "state": "configuring", 00:18:32.565 "raid_level": "raid0", 00:18:32.565 "superblock": true, 00:18:32.565 "num_base_bdevs": 4, 00:18:32.565 "num_base_bdevs_discovered": 2, 00:18:32.565 "num_base_bdevs_operational": 4, 00:18:32.565 "base_bdevs_list": [ 00:18:32.565 { 00:18:32.565 "name": null, 00:18:32.565 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:32.565 "is_configured": false, 00:18:32.565 "data_offset": 2048, 00:18:32.565 "data_size": 63488 00:18:32.565 }, 00:18:32.565 { 00:18:32.565 "name": null, 00:18:32.565 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:32.565 "is_configured": false, 00:18:32.565 "data_offset": 2048, 00:18:32.565 "data_size": 63488 00:18:32.565 }, 00:18:32.565 { 00:18:32.565 "name": "BaseBdev3", 00:18:32.565 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:32.565 "is_configured": true, 00:18:32.565 "data_offset": 2048, 00:18:32.565 "data_size": 63488 00:18:32.565 }, 00:18:32.565 { 00:18:32.565 "name": "BaseBdev4", 00:18:32.565 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:32.565 "is_configured": true, 00:18:32.565 "data_offset": 2048, 00:18:32.565 "data_size": 63488 00:18:32.565 } 00:18:32.565 ] 00:18:32.565 }' 00:18:32.565 18:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:32.565 18:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:33.133 18:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.133 18:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:33.393 18:22:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:33.393 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:33.652 [2024-07-12 18:22:17.261967] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.652 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.912 18:22:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.912 "name": "Existed_Raid", 00:18:33.912 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:33.912 "strip_size_kb": 64, 00:18:33.912 "state": "configuring", 00:18:33.912 "raid_level": "raid0", 00:18:33.912 "superblock": true, 00:18:33.912 "num_base_bdevs": 4, 00:18:33.912 "num_base_bdevs_discovered": 3, 00:18:33.912 "num_base_bdevs_operational": 4, 00:18:33.912 "base_bdevs_list": [ 00:18:33.912 { 00:18:33.912 "name": null, 00:18:33.912 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:33.912 "is_configured": false, 00:18:33.912 "data_offset": 2048, 00:18:33.912 "data_size": 63488 00:18:33.912 }, 00:18:33.912 { 00:18:33.912 "name": "BaseBdev2", 00:18:33.912 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:33.912 "is_configured": true, 00:18:33.912 "data_offset": 2048, 00:18:33.912 "data_size": 63488 00:18:33.912 }, 00:18:33.912 { 00:18:33.912 "name": "BaseBdev3", 00:18:33.912 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:33.912 "is_configured": true, 00:18:33.912 "data_offset": 2048, 00:18:33.912 "data_size": 63488 00:18:33.912 }, 00:18:33.912 { 00:18:33.912 "name": "BaseBdev4", 00:18:33.912 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:33.912 "is_configured": true, 00:18:33.912 "data_offset": 2048, 00:18:33.912 "data_size": 63488 00:18:33.912 } 00:18:33.912 ] 00:18:33.912 }' 00:18:33.912 18:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.912 18:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:34.479 18:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.479 18:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:34.738 18:22:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:34.738 18:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:34.738 18:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.997 18:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f8e95bb2-8196-4448-964a-2fec484a337e 00:18:35.256 [2024-07-12 18:22:18.866290] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:35.256 [2024-07-12 18:22:18.866432] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x180a470 00:18:35.256 [2024-07-12 18:22:18.866442] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:35.256 [2024-07-12 18:22:18.866565] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17fac40 00:18:35.256 [2024-07-12 18:22:18.866651] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x180a470 00:18:35.256 [2024-07-12 18:22:18.866658] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x180a470 00:18:35.256 [2024-07-12 18:22:18.866728] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:35.256 NewBaseBdev 00:18:35.256 18:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:35.256 18:22:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:35.256 18:22:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:35.256 18:22:18 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:18:35.256 18:22:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:35.256 18:22:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:35.256 18:22:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:35.514 18:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:35.773 [ 00:18:35.773 { 00:18:35.773 "name": "NewBaseBdev", 00:18:35.773 "aliases": [ 00:18:35.773 "f8e95bb2-8196-4448-964a-2fec484a337e" 00:18:35.773 ], 00:18:35.773 "product_name": "Malloc disk", 00:18:35.773 "block_size": 512, 00:18:35.773 "num_blocks": 65536, 00:18:35.773 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:35.773 "assigned_rate_limits": { 00:18:35.773 "rw_ios_per_sec": 0, 00:18:35.773 "rw_mbytes_per_sec": 0, 00:18:35.773 "r_mbytes_per_sec": 0, 00:18:35.773 "w_mbytes_per_sec": 0 00:18:35.773 }, 00:18:35.773 "claimed": true, 00:18:35.773 "claim_type": "exclusive_write", 00:18:35.773 "zoned": false, 00:18:35.773 "supported_io_types": { 00:18:35.773 "read": true, 00:18:35.773 "write": true, 00:18:35.773 "unmap": true, 00:18:35.773 "flush": true, 00:18:35.773 "reset": true, 00:18:35.773 "nvme_admin": false, 00:18:35.773 "nvme_io": false, 00:18:35.773 "nvme_io_md": false, 00:18:35.773 "write_zeroes": true, 00:18:35.773 "zcopy": true, 00:18:35.773 "get_zone_info": false, 00:18:35.773 "zone_management": false, 00:18:35.773 "zone_append": false, 00:18:35.773 "compare": false, 00:18:35.773 "compare_and_write": false, 00:18:35.773 "abort": true, 00:18:35.773 "seek_hole": false, 00:18:35.773 "seek_data": false, 00:18:35.773 "copy": true, 00:18:35.773 
"nvme_iov_md": false 00:18:35.773 }, 00:18:35.773 "memory_domains": [ 00:18:35.773 { 00:18:35.773 "dma_device_id": "system", 00:18:35.773 "dma_device_type": 1 00:18:35.773 }, 00:18:35.773 { 00:18:35.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.773 "dma_device_type": 2 00:18:35.773 } 00:18:35.773 ], 00:18:35.773 "driver_specific": {} 00:18:35.773 } 00:18:35.773 ] 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.773 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:18:36.032 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:36.032 "name": "Existed_Raid", 00:18:36.032 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:36.032 "strip_size_kb": 64, 00:18:36.032 "state": "online", 00:18:36.032 "raid_level": "raid0", 00:18:36.032 "superblock": true, 00:18:36.032 "num_base_bdevs": 4, 00:18:36.032 "num_base_bdevs_discovered": 4, 00:18:36.032 "num_base_bdevs_operational": 4, 00:18:36.032 "base_bdevs_list": [ 00:18:36.032 { 00:18:36.032 "name": "NewBaseBdev", 00:18:36.032 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:36.032 "is_configured": true, 00:18:36.032 "data_offset": 2048, 00:18:36.032 "data_size": 63488 00:18:36.032 }, 00:18:36.032 { 00:18:36.032 "name": "BaseBdev2", 00:18:36.032 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:36.032 "is_configured": true, 00:18:36.032 "data_offset": 2048, 00:18:36.032 "data_size": 63488 00:18:36.032 }, 00:18:36.032 { 00:18:36.032 "name": "BaseBdev3", 00:18:36.032 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:36.032 "is_configured": true, 00:18:36.032 "data_offset": 2048, 00:18:36.032 "data_size": 63488 00:18:36.032 }, 00:18:36.032 { 00:18:36.032 "name": "BaseBdev4", 00:18:36.032 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:36.032 "is_configured": true, 00:18:36.032 "data_offset": 2048, 00:18:36.032 "data_size": 63488 00:18:36.032 } 00:18:36.032 ] 00:18:36.032 }' 00:18:36.032 18:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:36.032 18:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:36.599 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:36.599 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:36.599 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # 
local raid_bdev_info 00:18:36.599 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:36.599 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:36.599 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:36.599 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:36.599 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:36.859 [2024-07-12 18:22:20.450610] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:36.859 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:36.859 "name": "Existed_Raid", 00:18:36.859 "aliases": [ 00:18:36.859 "99080195-628b-42cc-bb5a-90da6c3a136e" 00:18:36.859 ], 00:18:36.859 "product_name": "Raid Volume", 00:18:36.859 "block_size": 512, 00:18:36.859 "num_blocks": 253952, 00:18:36.859 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:36.859 "assigned_rate_limits": { 00:18:36.859 "rw_ios_per_sec": 0, 00:18:36.859 "rw_mbytes_per_sec": 0, 00:18:36.859 "r_mbytes_per_sec": 0, 00:18:36.859 "w_mbytes_per_sec": 0 00:18:36.859 }, 00:18:36.859 "claimed": false, 00:18:36.859 "zoned": false, 00:18:36.859 "supported_io_types": { 00:18:36.859 "read": true, 00:18:36.859 "write": true, 00:18:36.859 "unmap": true, 00:18:36.859 "flush": true, 00:18:36.859 "reset": true, 00:18:36.859 "nvme_admin": false, 00:18:36.859 "nvme_io": false, 00:18:36.859 "nvme_io_md": false, 00:18:36.859 "write_zeroes": true, 00:18:36.859 "zcopy": false, 00:18:36.859 "get_zone_info": false, 00:18:36.859 "zone_management": false, 00:18:36.859 "zone_append": false, 00:18:36.859 "compare": false, 00:18:36.859 "compare_and_write": false, 00:18:36.859 "abort": false, 
00:18:36.859 "seek_hole": false, 00:18:36.859 "seek_data": false, 00:18:36.859 "copy": false, 00:18:36.859 "nvme_iov_md": false 00:18:36.859 }, 00:18:36.859 "memory_domains": [ 00:18:36.859 { 00:18:36.859 "dma_device_id": "system", 00:18:36.859 "dma_device_type": 1 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.859 "dma_device_type": 2 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "dma_device_id": "system", 00:18:36.859 "dma_device_type": 1 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.859 "dma_device_type": 2 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "dma_device_id": "system", 00:18:36.859 "dma_device_type": 1 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.859 "dma_device_type": 2 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "dma_device_id": "system", 00:18:36.859 "dma_device_type": 1 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.859 "dma_device_type": 2 00:18:36.859 } 00:18:36.859 ], 00:18:36.859 "driver_specific": { 00:18:36.859 "raid": { 00:18:36.859 "uuid": "99080195-628b-42cc-bb5a-90da6c3a136e", 00:18:36.859 "strip_size_kb": 64, 00:18:36.859 "state": "online", 00:18:36.859 "raid_level": "raid0", 00:18:36.859 "superblock": true, 00:18:36.859 "num_base_bdevs": 4, 00:18:36.859 "num_base_bdevs_discovered": 4, 00:18:36.859 "num_base_bdevs_operational": 4, 00:18:36.859 "base_bdevs_list": [ 00:18:36.859 { 00:18:36.859 "name": "NewBaseBdev", 00:18:36.859 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:36.859 "is_configured": true, 00:18:36.859 "data_offset": 2048, 00:18:36.859 "data_size": 63488 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "name": "BaseBdev2", 00:18:36.859 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:36.859 "is_configured": true, 00:18:36.859 "data_offset": 2048, 00:18:36.859 "data_size": 63488 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "name": 
"BaseBdev3", 00:18:36.859 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:36.859 "is_configured": true, 00:18:36.859 "data_offset": 2048, 00:18:36.859 "data_size": 63488 00:18:36.859 }, 00:18:36.859 { 00:18:36.859 "name": "BaseBdev4", 00:18:36.859 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:36.859 "is_configured": true, 00:18:36.859 "data_offset": 2048, 00:18:36.859 "data_size": 63488 00:18:36.859 } 00:18:36.859 ] 00:18:36.859 } 00:18:36.859 } 00:18:36.859 }' 00:18:36.859 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:36.859 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:36.859 BaseBdev2 00:18:36.859 BaseBdev3 00:18:36.859 BaseBdev4' 00:18:36.859 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:36.859 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:36.859 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:37.118 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:37.118 "name": "NewBaseBdev", 00:18:37.118 "aliases": [ 00:18:37.118 "f8e95bb2-8196-4448-964a-2fec484a337e" 00:18:37.118 ], 00:18:37.118 "product_name": "Malloc disk", 00:18:37.118 "block_size": 512, 00:18:37.118 "num_blocks": 65536, 00:18:37.118 "uuid": "f8e95bb2-8196-4448-964a-2fec484a337e", 00:18:37.118 "assigned_rate_limits": { 00:18:37.118 "rw_ios_per_sec": 0, 00:18:37.118 "rw_mbytes_per_sec": 0, 00:18:37.118 "r_mbytes_per_sec": 0, 00:18:37.118 "w_mbytes_per_sec": 0 00:18:37.118 }, 00:18:37.118 "claimed": true, 00:18:37.118 "claim_type": "exclusive_write", 00:18:37.118 "zoned": false, 00:18:37.118 
"supported_io_types": { 00:18:37.118 "read": true, 00:18:37.119 "write": true, 00:18:37.119 "unmap": true, 00:18:37.119 "flush": true, 00:18:37.119 "reset": true, 00:18:37.119 "nvme_admin": false, 00:18:37.119 "nvme_io": false, 00:18:37.119 "nvme_io_md": false, 00:18:37.119 "write_zeroes": true, 00:18:37.119 "zcopy": true, 00:18:37.119 "get_zone_info": false, 00:18:37.119 "zone_management": false, 00:18:37.119 "zone_append": false, 00:18:37.119 "compare": false, 00:18:37.119 "compare_and_write": false, 00:18:37.119 "abort": true, 00:18:37.119 "seek_hole": false, 00:18:37.119 "seek_data": false, 00:18:37.119 "copy": true, 00:18:37.119 "nvme_iov_md": false 00:18:37.119 }, 00:18:37.119 "memory_domains": [ 00:18:37.119 { 00:18:37.119 "dma_device_id": "system", 00:18:37.119 "dma_device_type": 1 00:18:37.119 }, 00:18:37.119 { 00:18:37.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.119 "dma_device_type": 2 00:18:37.119 } 00:18:37.119 ], 00:18:37.119 "driver_specific": {} 00:18:37.119 }' 00:18:37.119 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.119 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.377 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:37.377 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.377 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.377 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:37.377 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.377 18:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.377 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:37.377 18:22:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.377 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:37.636 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:37.636 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:37.636 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:37.636 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:37.636 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:37.636 "name": "BaseBdev2", 00:18:37.636 "aliases": [ 00:18:37.636 "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a" 00:18:37.636 ], 00:18:37.636 "product_name": "Malloc disk", 00:18:37.636 "block_size": 512, 00:18:37.636 "num_blocks": 65536, 00:18:37.636 "uuid": "76d3a3f8-07d8-4cb3-88c9-90f5d4f7df6a", 00:18:37.636 "assigned_rate_limits": { 00:18:37.636 "rw_ios_per_sec": 0, 00:18:37.636 "rw_mbytes_per_sec": 0, 00:18:37.636 "r_mbytes_per_sec": 0, 00:18:37.636 "w_mbytes_per_sec": 0 00:18:37.636 }, 00:18:37.636 "claimed": true, 00:18:37.636 "claim_type": "exclusive_write", 00:18:37.636 "zoned": false, 00:18:37.636 "supported_io_types": { 00:18:37.636 "read": true, 00:18:37.636 "write": true, 00:18:37.636 "unmap": true, 00:18:37.636 "flush": true, 00:18:37.636 "reset": true, 00:18:37.636 "nvme_admin": false, 00:18:37.636 "nvme_io": false, 00:18:37.636 "nvme_io_md": false, 00:18:37.636 "write_zeroes": true, 00:18:37.636 "zcopy": true, 00:18:37.636 "get_zone_info": false, 00:18:37.636 "zone_management": false, 00:18:37.636 "zone_append": false, 00:18:37.636 "compare": false, 00:18:37.636 "compare_and_write": false, 00:18:37.636 "abort": true, 00:18:37.636 
"seek_hole": false, 00:18:37.636 "seek_data": false, 00:18:37.636 "copy": true, 00:18:37.636 "nvme_iov_md": false 00:18:37.636 }, 00:18:37.636 "memory_domains": [ 00:18:37.636 { 00:18:37.636 "dma_device_id": "system", 00:18:37.636 "dma_device_type": 1 00:18:37.636 }, 00:18:37.636 { 00:18:37.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.636 "dma_device_type": 2 00:18:37.636 } 00:18:37.636 ], 00:18:37.636 "driver_specific": {} 00:18:37.636 }' 00:18:37.636 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.895 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:37.895 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:37.895 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.895 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:37.895 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:37.895 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.895 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:37.895 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:37.895 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.153 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.153 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.153 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.153 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:38.153 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.412 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.412 "name": "BaseBdev3", 00:18:38.412 "aliases": [ 00:18:38.412 "02d28d6b-d39c-450c-967b-2af333a551bd" 00:18:38.412 ], 00:18:38.412 "product_name": "Malloc disk", 00:18:38.412 "block_size": 512, 00:18:38.412 "num_blocks": 65536, 00:18:38.412 "uuid": "02d28d6b-d39c-450c-967b-2af333a551bd", 00:18:38.412 "assigned_rate_limits": { 00:18:38.412 "rw_ios_per_sec": 0, 00:18:38.412 "rw_mbytes_per_sec": 0, 00:18:38.412 "r_mbytes_per_sec": 0, 00:18:38.412 "w_mbytes_per_sec": 0 00:18:38.412 }, 00:18:38.412 "claimed": true, 00:18:38.412 "claim_type": "exclusive_write", 00:18:38.412 "zoned": false, 00:18:38.412 "supported_io_types": { 00:18:38.412 "read": true, 00:18:38.412 "write": true, 00:18:38.412 "unmap": true, 00:18:38.412 "flush": true, 00:18:38.412 "reset": true, 00:18:38.412 "nvme_admin": false, 00:18:38.412 "nvme_io": false, 00:18:38.412 "nvme_io_md": false, 00:18:38.412 "write_zeroes": true, 00:18:38.412 "zcopy": true, 00:18:38.412 "get_zone_info": false, 00:18:38.412 "zone_management": false, 00:18:38.412 "zone_append": false, 00:18:38.412 "compare": false, 00:18:38.412 "compare_and_write": false, 00:18:38.412 "abort": true, 00:18:38.412 "seek_hole": false, 00:18:38.412 "seek_data": false, 00:18:38.412 "copy": true, 00:18:38.412 "nvme_iov_md": false 00:18:38.412 }, 00:18:38.412 "memory_domains": [ 00:18:38.412 { 00:18:38.412 "dma_device_id": "system", 00:18:38.412 "dma_device_type": 1 00:18:38.412 }, 00:18:38.412 { 00:18:38.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.412 "dma_device_type": 2 00:18:38.412 } 00:18:38.412 ], 00:18:38.412 "driver_specific": {} 00:18:38.412 }' 00:18:38.412 18:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.412 
18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.412 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:38.412 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.412 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:38.671 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:38.930 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:38.930 "name": "BaseBdev4", 00:18:38.930 "aliases": [ 00:18:38.930 "65e8b807-c7cd-4272-8ae0-9460b362bac7" 00:18:38.930 ], 00:18:38.930 "product_name": "Malloc disk", 00:18:38.930 "block_size": 512, 00:18:38.930 "num_blocks": 65536, 00:18:38.930 "uuid": "65e8b807-c7cd-4272-8ae0-9460b362bac7", 00:18:38.930 
"assigned_rate_limits": { 00:18:38.930 "rw_ios_per_sec": 0, 00:18:38.930 "rw_mbytes_per_sec": 0, 00:18:38.930 "r_mbytes_per_sec": 0, 00:18:38.930 "w_mbytes_per_sec": 0 00:18:38.930 }, 00:18:38.930 "claimed": true, 00:18:38.930 "claim_type": "exclusive_write", 00:18:38.930 "zoned": false, 00:18:38.930 "supported_io_types": { 00:18:38.930 "read": true, 00:18:38.930 "write": true, 00:18:38.930 "unmap": true, 00:18:38.930 "flush": true, 00:18:38.930 "reset": true, 00:18:38.930 "nvme_admin": false, 00:18:38.930 "nvme_io": false, 00:18:38.930 "nvme_io_md": false, 00:18:38.930 "write_zeroes": true, 00:18:38.930 "zcopy": true, 00:18:38.930 "get_zone_info": false, 00:18:38.930 "zone_management": false, 00:18:38.930 "zone_append": false, 00:18:38.930 "compare": false, 00:18:38.930 "compare_and_write": false, 00:18:38.930 "abort": true, 00:18:38.930 "seek_hole": false, 00:18:38.930 "seek_data": false, 00:18:38.930 "copy": true, 00:18:38.930 "nvme_iov_md": false 00:18:38.930 }, 00:18:38.930 "memory_domains": [ 00:18:38.930 { 00:18:38.930 "dma_device_id": "system", 00:18:38.930 "dma_device_type": 1 00:18:38.930 }, 00:18:38.930 { 00:18:38.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.930 "dma_device_type": 2 00:18:38.930 } 00:18:38.930 ], 00:18:38.930 "driver_specific": {} 00:18:38.930 }' 00:18:38.930 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.930 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:38.930 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:38.930 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.189 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:39.189 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:39.189 18:22:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.189 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:39.189 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:39.189 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.189 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:39.189 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:39.189 18:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:39.448 [2024-07-12 18:22:23.125432] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:39.448 [2024-07-12 18:22:23.125459] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:39.448 [2024-07-12 18:22:23.125508] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:39.448 [2024-07-12 18:22:23.125561] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:39.448 [2024-07-12 18:22:23.125570] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180a470 name Existed_Raid, state offline 00:18:39.448 18:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2523675 00:18:39.448 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2523675 ']' 00:18:39.448 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2523675 00:18:39.448 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:39.448 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 
-- # '[' Linux = Linux ']' 00:18:39.448 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2523675 00:18:39.707 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:39.707 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:39.707 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2523675' 00:18:39.707 killing process with pid 2523675 00:18:39.707 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2523675 00:18:39.707 [2024-07-12 18:22:23.194701] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:39.707 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2523675 00:18:39.707 [2024-07-12 18:22:23.279681] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:39.965 18:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:39.965 00:18:39.965 real 0m35.785s 00:18:39.965 user 1m5.520s 00:18:39.965 sys 0m6.135s 00:18:39.965 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:39.965 18:22:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:39.965 ************************************ 00:18:39.965 END TEST raid_state_function_test_sb 00:18:39.965 ************************************ 00:18:40.224 18:22:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:40.224 18:22:23 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:18:40.224 18:22:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:40.224 18:22:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:40.224 18:22:23 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:18:40.224 ************************************ 00:18:40.224 START TEST raid_superblock_test 00:18:40.224 ************************************ 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 
00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2529047 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2529047 /var/tmp/spdk-raid.sock 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2529047 ']' 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:40.224 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:40.224 18:22:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.224 [2024-07-12 18:22:23.813031] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:18:40.224 [2024-07-12 18:22:23.813103] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2529047 ] 00:18:40.224 [2024-07-12 18:22:23.944306] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:40.483 [2024-07-12 18:22:24.052911] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:40.483 [2024-07-12 18:22:24.115305] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:40.483 [2024-07-12 18:22:24.115350] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:41.100 18:22:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:41.358 malloc1 00:18:41.358 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:41.616 [2024-07-12 18:22:25.225317] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:41.616 [2024-07-12 18:22:25.225365] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:41.616 [2024-07-12 18:22:25.225386] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x82a570 00:18:41.616 [2024-07-12 18:22:25.225399] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:41.616 [2024-07-12 18:22:25.227035] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:41.616 [2024-07-12 18:22:25.227063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:41.616 pt1 00:18:41.616 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:41.616 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:41.616 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:41.616 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:41.616 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:41.616 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:41.617 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:41.617 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:41.617 18:22:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:41.875 malloc2 00:18:41.875 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:42.133 [2024-07-12 18:22:25.711336] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:42.133 [2024-07-12 18:22:25.711382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:42.133 [2024-07-12 18:22:25.711399] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x82b970 00:18:42.133 [2024-07-12 18:22:25.711412] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:42.133 [2024-07-12 18:22:25.712855] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:42.133 [2024-07-12 18:22:25.712883] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:42.133 pt2 00:18:42.133 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:42.133 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:42.133 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:42.133 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:42.133 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:42.133 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:42.133 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:42.133 18:22:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:42.133 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:42.391 malloc3 00:18:42.391 18:22:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:42.649 [2024-07-12 18:22:26.213317] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:42.649 [2024-07-12 18:22:26.213363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:42.649 [2024-07-12 18:22:26.213381] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c2340 00:18:42.649 [2024-07-12 18:22:26.213394] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:42.649 [2024-07-12 18:22:26.214809] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:42.649 [2024-07-12 18:22:26.214837] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:42.649 pt3 00:18:42.649 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:42.649 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:42.649 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:42.649 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:42.649 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:42.649 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:42.649 
18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:42.649 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:42.649 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:42.907 malloc4 00:18:42.907 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:43.166 [2024-07-12 18:22:26.711155] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:43.166 [2024-07-12 18:22:26.711202] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:43.166 [2024-07-12 18:22:26.711222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c4c60 00:18:43.166 [2024-07-12 18:22:26.711235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:43.166 [2024-07-12 18:22:26.712621] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:43.166 [2024-07-12 18:22:26.712649] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:43.166 pt4 00:18:43.166 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:43.166 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:43.166 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:43.425 [2024-07-12 18:22:26.955820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:18:43.425 [2024-07-12 18:22:26.957022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:43.425 [2024-07-12 18:22:26.957076] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:43.425 [2024-07-12 18:22:26.957118] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:43.425 [2024-07-12 18:22:26.957281] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x822530 00:18:43.425 [2024-07-12 18:22:26.957291] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:43.425 [2024-07-12 18:22:26.957477] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x820770 00:18:43.425 [2024-07-12 18:22:26.957618] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x822530 00:18:43.425 [2024-07-12 18:22:26.957628] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x822530 00:18:43.425 [2024-07-12 18:22:26.957718] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.425 18:22:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:43.684 18:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.684 "name": "raid_bdev1", 00:18:43.684 "uuid": "fda48608-27c8-4100-a20e-1e962fadc5f6", 00:18:43.684 "strip_size_kb": 64, 00:18:43.684 "state": "online", 00:18:43.684 "raid_level": "raid0", 00:18:43.684 "superblock": true, 00:18:43.684 "num_base_bdevs": 4, 00:18:43.684 "num_base_bdevs_discovered": 4, 00:18:43.684 "num_base_bdevs_operational": 4, 00:18:43.684 "base_bdevs_list": [ 00:18:43.684 { 00:18:43.684 "name": "pt1", 00:18:43.684 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:43.684 "is_configured": true, 00:18:43.684 "data_offset": 2048, 00:18:43.684 "data_size": 63488 00:18:43.684 }, 00:18:43.684 { 00:18:43.684 "name": "pt2", 00:18:43.684 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:43.684 "is_configured": true, 00:18:43.684 "data_offset": 2048, 00:18:43.684 "data_size": 63488 00:18:43.684 }, 00:18:43.684 { 00:18:43.684 "name": "pt3", 00:18:43.684 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:43.684 "is_configured": true, 00:18:43.684 "data_offset": 2048, 00:18:43.684 "data_size": 63488 00:18:43.684 }, 00:18:43.684 { 00:18:43.684 "name": "pt4", 00:18:43.684 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:43.684 "is_configured": true, 00:18:43.684 "data_offset": 2048, 00:18:43.684 "data_size": 63488 00:18:43.684 } 00:18:43.684 ] 00:18:43.684 }' 00:18:43.684 18:22:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.684 18:22:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:44.252 18:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:44.252 18:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:44.252 18:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:44.252 18:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:44.252 18:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:44.252 18:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:44.252 18:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:44.252 18:22:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:44.511 [2024-07-12 18:22:28.051007] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:44.511 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:44.511 "name": "raid_bdev1", 00:18:44.511 "aliases": [ 00:18:44.511 "fda48608-27c8-4100-a20e-1e962fadc5f6" 00:18:44.511 ], 00:18:44.511 "product_name": "Raid Volume", 00:18:44.511 "block_size": 512, 00:18:44.511 "num_blocks": 253952, 00:18:44.511 "uuid": "fda48608-27c8-4100-a20e-1e962fadc5f6", 00:18:44.511 "assigned_rate_limits": { 00:18:44.511 "rw_ios_per_sec": 0, 00:18:44.511 "rw_mbytes_per_sec": 0, 00:18:44.511 "r_mbytes_per_sec": 0, 00:18:44.511 "w_mbytes_per_sec": 0 00:18:44.511 }, 00:18:44.511 "claimed": false, 00:18:44.511 "zoned": false, 00:18:44.511 "supported_io_types": { 00:18:44.511 "read": true, 00:18:44.511 "write": true, 00:18:44.511 
"unmap": true, 00:18:44.511 "flush": true, 00:18:44.511 "reset": true, 00:18:44.511 "nvme_admin": false, 00:18:44.511 "nvme_io": false, 00:18:44.511 "nvme_io_md": false, 00:18:44.511 "write_zeroes": true, 00:18:44.511 "zcopy": false, 00:18:44.511 "get_zone_info": false, 00:18:44.511 "zone_management": false, 00:18:44.511 "zone_append": false, 00:18:44.511 "compare": false, 00:18:44.511 "compare_and_write": false, 00:18:44.511 "abort": false, 00:18:44.511 "seek_hole": false, 00:18:44.511 "seek_data": false, 00:18:44.511 "copy": false, 00:18:44.511 "nvme_iov_md": false 00:18:44.511 }, 00:18:44.511 "memory_domains": [ 00:18:44.511 { 00:18:44.511 "dma_device_id": "system", 00:18:44.511 "dma_device_type": 1 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.511 "dma_device_type": 2 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "dma_device_id": "system", 00:18:44.511 "dma_device_type": 1 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.511 "dma_device_type": 2 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "dma_device_id": "system", 00:18:44.511 "dma_device_type": 1 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.511 "dma_device_type": 2 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "dma_device_id": "system", 00:18:44.511 "dma_device_type": 1 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.511 "dma_device_type": 2 00:18:44.511 } 00:18:44.511 ], 00:18:44.511 "driver_specific": { 00:18:44.511 "raid": { 00:18:44.511 "uuid": "fda48608-27c8-4100-a20e-1e962fadc5f6", 00:18:44.511 "strip_size_kb": 64, 00:18:44.511 "state": "online", 00:18:44.511 "raid_level": "raid0", 00:18:44.511 "superblock": true, 00:18:44.511 "num_base_bdevs": 4, 00:18:44.511 "num_base_bdevs_discovered": 4, 00:18:44.511 "num_base_bdevs_operational": 4, 00:18:44.511 "base_bdevs_list": [ 00:18:44.511 { 00:18:44.511 "name": "pt1", 
00:18:44.511 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:44.511 "is_configured": true, 00:18:44.511 "data_offset": 2048, 00:18:44.511 "data_size": 63488 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "name": "pt2", 00:18:44.511 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:44.511 "is_configured": true, 00:18:44.511 "data_offset": 2048, 00:18:44.511 "data_size": 63488 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "name": "pt3", 00:18:44.511 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:44.511 "is_configured": true, 00:18:44.511 "data_offset": 2048, 00:18:44.511 "data_size": 63488 00:18:44.511 }, 00:18:44.511 { 00:18:44.511 "name": "pt4", 00:18:44.511 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:44.511 "is_configured": true, 00:18:44.511 "data_offset": 2048, 00:18:44.511 "data_size": 63488 00:18:44.511 } 00:18:44.511 ] 00:18:44.511 } 00:18:44.511 } 00:18:44.511 }' 00:18:44.511 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:44.511 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:44.511 pt2 00:18:44.511 pt3 00:18:44.511 pt4' 00:18:44.511 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:44.511 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:44.511 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:44.769 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:44.769 "name": "pt1", 00:18:44.769 "aliases": [ 00:18:44.769 "00000000-0000-0000-0000-000000000001" 00:18:44.769 ], 00:18:44.769 "product_name": "passthru", 00:18:44.769 "block_size": 512, 00:18:44.769 "num_blocks": 65536, 00:18:44.769 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:18:44.769 "assigned_rate_limits": { 00:18:44.769 "rw_ios_per_sec": 0, 00:18:44.769 "rw_mbytes_per_sec": 0, 00:18:44.769 "r_mbytes_per_sec": 0, 00:18:44.769 "w_mbytes_per_sec": 0 00:18:44.769 }, 00:18:44.769 "claimed": true, 00:18:44.769 "claim_type": "exclusive_write", 00:18:44.769 "zoned": false, 00:18:44.769 "supported_io_types": { 00:18:44.769 "read": true, 00:18:44.769 "write": true, 00:18:44.769 "unmap": true, 00:18:44.769 "flush": true, 00:18:44.769 "reset": true, 00:18:44.769 "nvme_admin": false, 00:18:44.769 "nvme_io": false, 00:18:44.769 "nvme_io_md": false, 00:18:44.769 "write_zeroes": true, 00:18:44.769 "zcopy": true, 00:18:44.769 "get_zone_info": false, 00:18:44.769 "zone_management": false, 00:18:44.769 "zone_append": false, 00:18:44.769 "compare": false, 00:18:44.769 "compare_and_write": false, 00:18:44.769 "abort": true, 00:18:44.769 "seek_hole": false, 00:18:44.769 "seek_data": false, 00:18:44.769 "copy": true, 00:18:44.769 "nvme_iov_md": false 00:18:44.769 }, 00:18:44.769 "memory_domains": [ 00:18:44.769 { 00:18:44.769 "dma_device_id": "system", 00:18:44.769 "dma_device_type": 1 00:18:44.769 }, 00:18:44.769 { 00:18:44.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.769 "dma_device_type": 2 00:18:44.769 } 00:18:44.769 ], 00:18:44.769 "driver_specific": { 00:18:44.769 "passthru": { 00:18:44.769 "name": "pt1", 00:18:44.769 "base_bdev_name": "malloc1" 00:18:44.769 } 00:18:44.769 } 00:18:44.769 }' 00:18:44.769 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:44.769 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:44.769 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:44.769 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.027 18:22:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:45.027 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:45.286 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:45.286 "name": "pt2", 00:18:45.286 "aliases": [ 00:18:45.286 "00000000-0000-0000-0000-000000000002" 00:18:45.286 ], 00:18:45.286 "product_name": "passthru", 00:18:45.286 "block_size": 512, 00:18:45.286 "num_blocks": 65536, 00:18:45.286 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:45.286 "assigned_rate_limits": { 00:18:45.286 "rw_ios_per_sec": 0, 00:18:45.286 "rw_mbytes_per_sec": 0, 00:18:45.286 "r_mbytes_per_sec": 0, 00:18:45.286 "w_mbytes_per_sec": 0 00:18:45.286 }, 00:18:45.286 "claimed": true, 00:18:45.286 "claim_type": "exclusive_write", 00:18:45.286 "zoned": false, 00:18:45.286 "supported_io_types": { 00:18:45.286 "read": true, 00:18:45.286 "write": true, 00:18:45.286 "unmap": true, 00:18:45.286 "flush": true, 00:18:45.286 "reset": true, 00:18:45.286 "nvme_admin": false, 00:18:45.286 
"nvme_io": false, 00:18:45.286 "nvme_io_md": false, 00:18:45.286 "write_zeroes": true, 00:18:45.286 "zcopy": true, 00:18:45.286 "get_zone_info": false, 00:18:45.286 "zone_management": false, 00:18:45.286 "zone_append": false, 00:18:45.286 "compare": false, 00:18:45.286 "compare_and_write": false, 00:18:45.286 "abort": true, 00:18:45.286 "seek_hole": false, 00:18:45.286 "seek_data": false, 00:18:45.286 "copy": true, 00:18:45.286 "nvme_iov_md": false 00:18:45.286 }, 00:18:45.286 "memory_domains": [ 00:18:45.286 { 00:18:45.286 "dma_device_id": "system", 00:18:45.286 "dma_device_type": 1 00:18:45.286 }, 00:18:45.286 { 00:18:45.286 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:45.286 "dma_device_type": 2 00:18:45.286 } 00:18:45.286 ], 00:18:45.286 "driver_specific": { 00:18:45.286 "passthru": { 00:18:45.286 "name": "pt2", 00:18:45.286 "base_bdev_name": "malloc2" 00:18:45.286 } 00:18:45.286 } 00:18:45.286 }' 00:18:45.286 18:22:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.544 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.544 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:45.544 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.544 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.544 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:45.544 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.544 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.544 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:45.544 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.802 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:18:45.802 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:45.802 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:45.802 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:45.802 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:46.061 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.061 "name": "pt3", 00:18:46.061 "aliases": [ 00:18:46.061 "00000000-0000-0000-0000-000000000003" 00:18:46.061 ], 00:18:46.061 "product_name": "passthru", 00:18:46.061 "block_size": 512, 00:18:46.061 "num_blocks": 65536, 00:18:46.061 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:46.061 "assigned_rate_limits": { 00:18:46.061 "rw_ios_per_sec": 0, 00:18:46.061 "rw_mbytes_per_sec": 0, 00:18:46.061 "r_mbytes_per_sec": 0, 00:18:46.061 "w_mbytes_per_sec": 0 00:18:46.061 }, 00:18:46.061 "claimed": true, 00:18:46.061 "claim_type": "exclusive_write", 00:18:46.061 "zoned": false, 00:18:46.061 "supported_io_types": { 00:18:46.061 "read": true, 00:18:46.061 "write": true, 00:18:46.061 "unmap": true, 00:18:46.061 "flush": true, 00:18:46.061 "reset": true, 00:18:46.061 "nvme_admin": false, 00:18:46.061 "nvme_io": false, 00:18:46.061 "nvme_io_md": false, 00:18:46.061 "write_zeroes": true, 00:18:46.061 "zcopy": true, 00:18:46.061 "get_zone_info": false, 00:18:46.061 "zone_management": false, 00:18:46.061 "zone_append": false, 00:18:46.061 "compare": false, 00:18:46.061 "compare_and_write": false, 00:18:46.061 "abort": true, 00:18:46.061 "seek_hole": false, 00:18:46.061 "seek_data": false, 00:18:46.061 "copy": true, 00:18:46.061 "nvme_iov_md": false 00:18:46.061 }, 00:18:46.061 "memory_domains": [ 00:18:46.061 { 00:18:46.061 "dma_device_id": "system", 00:18:46.061 
"dma_device_type": 1 00:18:46.061 }, 00:18:46.061 { 00:18:46.061 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.061 "dma_device_type": 2 00:18:46.061 } 00:18:46.061 ], 00:18:46.061 "driver_specific": { 00:18:46.061 "passthru": { 00:18:46.061 "name": "pt3", 00:18:46.061 "base_bdev_name": "malloc3" 00:18:46.061 } 00:18:46.061 } 00:18:46.061 }' 00:18:46.061 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.061 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.061 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.061 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.061 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.061 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.061 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.061 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.320 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:46.320 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.320 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.320 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:46.320 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:46.320 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:46.320 18:22:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:46.579 18:22:30 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:46.579 "name": "pt4", 00:18:46.579 "aliases": [ 00:18:46.579 "00000000-0000-0000-0000-000000000004" 00:18:46.579 ], 00:18:46.579 "product_name": "passthru", 00:18:46.579 "block_size": 512, 00:18:46.579 "num_blocks": 65536, 00:18:46.579 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:46.579 "assigned_rate_limits": { 00:18:46.579 "rw_ios_per_sec": 0, 00:18:46.579 "rw_mbytes_per_sec": 0, 00:18:46.579 "r_mbytes_per_sec": 0, 00:18:46.579 "w_mbytes_per_sec": 0 00:18:46.579 }, 00:18:46.579 "claimed": true, 00:18:46.579 "claim_type": "exclusive_write", 00:18:46.579 "zoned": false, 00:18:46.579 "supported_io_types": { 00:18:46.579 "read": true, 00:18:46.579 "write": true, 00:18:46.579 "unmap": true, 00:18:46.579 "flush": true, 00:18:46.579 "reset": true, 00:18:46.579 "nvme_admin": false, 00:18:46.579 "nvme_io": false, 00:18:46.579 "nvme_io_md": false, 00:18:46.579 "write_zeroes": true, 00:18:46.579 "zcopy": true, 00:18:46.579 "get_zone_info": false, 00:18:46.579 "zone_management": false, 00:18:46.579 "zone_append": false, 00:18:46.579 "compare": false, 00:18:46.579 "compare_and_write": false, 00:18:46.579 "abort": true, 00:18:46.579 "seek_hole": false, 00:18:46.579 "seek_data": false, 00:18:46.579 "copy": true, 00:18:46.579 "nvme_iov_md": false 00:18:46.579 }, 00:18:46.579 "memory_domains": [ 00:18:46.579 { 00:18:46.579 "dma_device_id": "system", 00:18:46.579 "dma_device_type": 1 00:18:46.579 }, 00:18:46.579 { 00:18:46.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.579 "dma_device_type": 2 00:18:46.579 } 00:18:46.579 ], 00:18:46.579 "driver_specific": { 00:18:46.579 "passthru": { 00:18:46.579 "name": "pt4", 00:18:46.579 "base_bdev_name": "malloc4" 00:18:46.579 } 00:18:46.579 } 00:18:46.579 }' 00:18:46.579 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.579 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:46.579 18:22:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:46.579 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.579 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:46.837 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:46.837 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.837 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:46.837 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:46.837 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.837 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:46.837 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:46.837 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:46.838 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:47.096 [2024-07-12 18:22:30.746168] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:47.096 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=fda48608-27c8-4100-a20e-1e962fadc5f6 00:18:47.096 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z fda48608-27c8-4100-a20e-1e962fadc5f6 ']' 00:18:47.096 18:22:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:47.355 [2024-07-12 18:22:30.994506] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:47.355 
[2024-07-12 18:22:30.994534] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:47.355 [2024-07-12 18:22:30.994589] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:47.355 [2024-07-12 18:22:30.994653] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:47.355 [2024-07-12 18:22:30.994665] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x822530 name raid_bdev1, state offline 00:18:47.355 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.355 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:47.614 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:47.614 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:47.614 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:47.614 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:47.872 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:47.872 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:48.131 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:48.131 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:48.390 18:22:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:48.390 18:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:48.649 18:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:48.649 18:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:48.908 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:49.167 [2024-07-12 18:22:32.670875] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:49.167 [2024-07-12 18:22:32.672266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:49.167 [2024-07-12 18:22:32.672311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:49.167 [2024-07-12 18:22:32.672345] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:49.167 [2024-07-12 18:22:32.672391] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:49.167 [2024-07-12 18:22:32.672432] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:49.167 [2024-07-12 18:22:32.672455] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:49.167 [2024-07-12 18:22:32.672477] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:49.167 
[2024-07-12 18:22:32.672495] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:49.167 [2024-07-12 18:22:32.672505] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9cdff0 name raid_bdev1, state configuring 00:18:49.167 request: 00:18:49.167 { 00:18:49.167 "name": "raid_bdev1", 00:18:49.167 "raid_level": "raid0", 00:18:49.167 "base_bdevs": [ 00:18:49.167 "malloc1", 00:18:49.167 "malloc2", 00:18:49.167 "malloc3", 00:18:49.167 "malloc4" 00:18:49.167 ], 00:18:49.167 "strip_size_kb": 64, 00:18:49.167 "superblock": false, 00:18:49.167 "method": "bdev_raid_create", 00:18:49.167 "req_id": 1 00:18:49.167 } 00:18:49.167 Got JSON-RPC error response 00:18:49.167 response: 00:18:49.167 { 00:18:49.167 "code": -17, 00:18:49.167 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:49.167 } 00:18:49.167 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:49.167 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:49.167 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:49.167 18:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:49.167 18:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.167 18:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:49.425 18:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:49.425 18:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:49.425 18:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:18:49.683 [2024-07-12 18:22:33.168132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:49.683 [2024-07-12 18:22:33.168181] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:49.683 [2024-07-12 18:22:33.168204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x82a7a0 00:18:49.683 [2024-07-12 18:22:33.168216] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:49.683 [2024-07-12 18:22:33.169819] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:49.683 [2024-07-12 18:22:33.169848] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:49.683 [2024-07-12 18:22:33.169917] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:49.683 [2024-07-12 18:22:33.169954] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:49.683 pt1 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.683 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:49.942 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:49.942 "name": "raid_bdev1", 00:18:49.942 "uuid": "fda48608-27c8-4100-a20e-1e962fadc5f6", 00:18:49.942 "strip_size_kb": 64, 00:18:49.942 "state": "configuring", 00:18:49.942 "raid_level": "raid0", 00:18:49.942 "superblock": true, 00:18:49.942 "num_base_bdevs": 4, 00:18:49.942 "num_base_bdevs_discovered": 1, 00:18:49.942 "num_base_bdevs_operational": 4, 00:18:49.942 "base_bdevs_list": [ 00:18:49.942 { 00:18:49.942 "name": "pt1", 00:18:49.942 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:49.942 "is_configured": true, 00:18:49.942 "data_offset": 2048, 00:18:49.942 "data_size": 63488 00:18:49.942 }, 00:18:49.942 { 00:18:49.943 "name": null, 00:18:49.943 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:49.943 "is_configured": false, 00:18:49.943 "data_offset": 2048, 00:18:49.943 "data_size": 63488 00:18:49.943 }, 00:18:49.943 { 00:18:49.943 "name": null, 00:18:49.943 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:49.943 "is_configured": false, 00:18:49.943 "data_offset": 2048, 00:18:49.943 "data_size": 63488 00:18:49.943 }, 00:18:49.943 { 00:18:49.943 "name": null, 00:18:49.943 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:49.943 "is_configured": false, 00:18:49.943 "data_offset": 2048, 00:18:49.943 "data_size": 63488 00:18:49.943 } 00:18:49.943 ] 00:18:49.943 }' 00:18:49.943 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:49.943 18:22:33 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:50.509 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:18:50.509 18:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:50.509 [2024-07-12 18:22:34.118656] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:50.509 [2024-07-12 18:22:34.118708] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:50.509 [2024-07-12 18:22:34.118729] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c3940 00:18:50.509 [2024-07-12 18:22:34.118741] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:50.509 [2024-07-12 18:22:34.119098] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:50.509 [2024-07-12 18:22:34.119117] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:50.509 [2024-07-12 18:22:34.119182] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:50.509 [2024-07-12 18:22:34.119200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:50.509 pt2 00:18:50.509 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:50.767 [2024-07-12 18:22:34.367322] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:50.767 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:50.767 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:50.767 18:22:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:50.767 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:50.767 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.767 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.767 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.767 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.767 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.768 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.768 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.768 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.026 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.026 "name": "raid_bdev1", 00:18:51.026 "uuid": "fda48608-27c8-4100-a20e-1e962fadc5f6", 00:18:51.026 "strip_size_kb": 64, 00:18:51.026 "state": "configuring", 00:18:51.026 "raid_level": "raid0", 00:18:51.026 "superblock": true, 00:18:51.026 "num_base_bdevs": 4, 00:18:51.026 "num_base_bdevs_discovered": 1, 00:18:51.026 "num_base_bdevs_operational": 4, 00:18:51.026 "base_bdevs_list": [ 00:18:51.026 { 00:18:51.026 "name": "pt1", 00:18:51.026 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:51.026 "is_configured": true, 00:18:51.026 "data_offset": 2048, 00:18:51.026 "data_size": 63488 00:18:51.026 }, 00:18:51.026 { 00:18:51.026 "name": null, 00:18:51.026 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:51.026 
"is_configured": false, 00:18:51.026 "data_offset": 2048, 00:18:51.026 "data_size": 63488 00:18:51.026 }, 00:18:51.026 { 00:18:51.026 "name": null, 00:18:51.026 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:51.026 "is_configured": false, 00:18:51.026 "data_offset": 2048, 00:18:51.026 "data_size": 63488 00:18:51.026 }, 00:18:51.026 { 00:18:51.026 "name": null, 00:18:51.026 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:51.026 "is_configured": false, 00:18:51.026 "data_offset": 2048, 00:18:51.026 "data_size": 63488 00:18:51.026 } 00:18:51.026 ] 00:18:51.026 }' 00:18:51.026 18:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.026 18:22:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.593 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:51.593 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:51.593 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:51.851 [2024-07-12 18:22:35.382011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:51.851 [2024-07-12 18:22:35.382064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:51.851 [2024-07-12 18:22:35.382083] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x821060 00:18:51.851 [2024-07-12 18:22:35.382096] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:51.851 [2024-07-12 18:22:35.382439] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:51.851 [2024-07-12 18:22:35.382456] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:51.851 [2024-07-12 18:22:35.382521] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:51.851 [2024-07-12 18:22:35.382540] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:51.851 pt2 00:18:51.851 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:51.851 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:51.851 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:52.109 [2024-07-12 18:22:35.630661] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:52.109 [2024-07-12 18:22:35.630697] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.109 [2024-07-12 18:22:35.630721] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8238d0 00:18:52.109 [2024-07-12 18:22:35.630734] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.109 [2024-07-12 18:22:35.631046] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.109 [2024-07-12 18:22:35.631064] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:52.109 [2024-07-12 18:22:35.631118] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:52.109 [2024-07-12 18:22:35.631136] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:52.109 pt3 00:18:52.109 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:52.109 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:52.109 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:52.368 [2024-07-12 18:22:35.879327] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:52.368 [2024-07-12 18:22:35.879358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.368 [2024-07-12 18:22:35.879373] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x824b80 00:18:52.368 [2024-07-12 18:22:35.879386] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.368 [2024-07-12 18:22:35.879651] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.368 [2024-07-12 18:22:35.879667] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:52.368 [2024-07-12 18:22:35.879714] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:52.368 [2024-07-12 18:22:35.879731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:52.368 [2024-07-12 18:22:35.879842] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x821780 00:18:52.368 [2024-07-12 18:22:35.879852] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:52.368 [2024-07-12 18:22:35.880026] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x826d70 00:18:52.368 [2024-07-12 18:22:35.880149] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x821780 00:18:52.368 [2024-07-12 18:22:35.880159] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x821780 00:18:52.368 [2024-07-12 18:22:35.880252] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:52.368 pt4 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.368 18:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:52.628 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.628 "name": "raid_bdev1", 00:18:52.628 "uuid": "fda48608-27c8-4100-a20e-1e962fadc5f6", 00:18:52.628 "strip_size_kb": 64, 00:18:52.628 "state": "online", 00:18:52.628 "raid_level": "raid0", 00:18:52.628 "superblock": true, 00:18:52.628 "num_base_bdevs": 4, 00:18:52.628 "num_base_bdevs_discovered": 4, 00:18:52.628 "num_base_bdevs_operational": 4, 00:18:52.628 "base_bdevs_list": [ 00:18:52.628 { 00:18:52.628 
"name": "pt1", 00:18:52.628 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:52.628 "is_configured": true, 00:18:52.628 "data_offset": 2048, 00:18:52.628 "data_size": 63488 00:18:52.628 }, 00:18:52.628 { 00:18:52.628 "name": "pt2", 00:18:52.628 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:52.628 "is_configured": true, 00:18:52.628 "data_offset": 2048, 00:18:52.628 "data_size": 63488 00:18:52.628 }, 00:18:52.628 { 00:18:52.628 "name": "pt3", 00:18:52.628 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:52.628 "is_configured": true, 00:18:52.628 "data_offset": 2048, 00:18:52.628 "data_size": 63488 00:18:52.628 }, 00:18:52.628 { 00:18:52.628 "name": "pt4", 00:18:52.628 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:52.628 "is_configured": true, 00:18:52.628 "data_offset": 2048, 00:18:52.628 "data_size": 63488 00:18:52.628 } 00:18:52.628 ] 00:18:52.628 }' 00:18:52.628 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.628 18:22:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:53.196 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:53.196 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:53.196 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:53.196 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:53.196 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:53.196 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:53.196 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:53.196 18:22:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:53.454 [2024-07-12 18:22:36.950492] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:53.454 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:53.454 "name": "raid_bdev1", 00:18:53.454 "aliases": [ 00:18:53.454 "fda48608-27c8-4100-a20e-1e962fadc5f6" 00:18:53.454 ], 00:18:53.454 "product_name": "Raid Volume", 00:18:53.454 "block_size": 512, 00:18:53.454 "num_blocks": 253952, 00:18:53.454 "uuid": "fda48608-27c8-4100-a20e-1e962fadc5f6", 00:18:53.454 "assigned_rate_limits": { 00:18:53.455 "rw_ios_per_sec": 0, 00:18:53.455 "rw_mbytes_per_sec": 0, 00:18:53.455 "r_mbytes_per_sec": 0, 00:18:53.455 "w_mbytes_per_sec": 0 00:18:53.455 }, 00:18:53.455 "claimed": false, 00:18:53.455 "zoned": false, 00:18:53.455 "supported_io_types": { 00:18:53.455 "read": true, 00:18:53.455 "write": true, 00:18:53.455 "unmap": true, 00:18:53.455 "flush": true, 00:18:53.455 "reset": true, 00:18:53.455 "nvme_admin": false, 00:18:53.455 "nvme_io": false, 00:18:53.455 "nvme_io_md": false, 00:18:53.455 "write_zeroes": true, 00:18:53.455 "zcopy": false, 00:18:53.455 "get_zone_info": false, 00:18:53.455 "zone_management": false, 00:18:53.455 "zone_append": false, 00:18:53.455 "compare": false, 00:18:53.455 "compare_and_write": false, 00:18:53.455 "abort": false, 00:18:53.455 "seek_hole": false, 00:18:53.455 "seek_data": false, 00:18:53.455 "copy": false, 00:18:53.455 "nvme_iov_md": false 00:18:53.455 }, 00:18:53.455 "memory_domains": [ 00:18:53.455 { 00:18:53.455 "dma_device_id": "system", 00:18:53.455 "dma_device_type": 1 00:18:53.455 }, 00:18:53.455 { 00:18:53.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.455 "dma_device_type": 2 00:18:53.455 }, 00:18:53.455 { 00:18:53.455 "dma_device_id": "system", 00:18:53.455 "dma_device_type": 1 00:18:53.455 }, 00:18:53.455 { 00:18:53.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.455 "dma_device_type": 2 00:18:53.455 }, 
00:18:53.455 { 00:18:53.455 "dma_device_id": "system", 00:18:53.455 "dma_device_type": 1 00:18:53.455 }, 00:18:53.455 { 00:18:53.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.455 "dma_device_type": 2 00:18:53.455 }, 00:18:53.455 { 00:18:53.455 "dma_device_id": "system", 00:18:53.455 "dma_device_type": 1 00:18:53.455 }, 00:18:53.455 { 00:18:53.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.455 "dma_device_type": 2 00:18:53.455 } 00:18:53.455 ], 00:18:53.455 "driver_specific": { 00:18:53.455 "raid": { 00:18:53.455 "uuid": "fda48608-27c8-4100-a20e-1e962fadc5f6", 00:18:53.455 "strip_size_kb": 64, 00:18:53.455 "state": "online", 00:18:53.455 "raid_level": "raid0", 00:18:53.455 "superblock": true, 00:18:53.455 "num_base_bdevs": 4, 00:18:53.455 "num_base_bdevs_discovered": 4, 00:18:53.455 "num_base_bdevs_operational": 4, 00:18:53.455 "base_bdevs_list": [ 00:18:53.455 { 00:18:53.455 "name": "pt1", 00:18:53.455 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:53.455 "is_configured": true, 00:18:53.455 "data_offset": 2048, 00:18:53.455 "data_size": 63488 00:18:53.455 }, 00:18:53.455 { 00:18:53.455 "name": "pt2", 00:18:53.455 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:53.455 "is_configured": true, 00:18:53.455 "data_offset": 2048, 00:18:53.455 "data_size": 63488 00:18:53.455 }, 00:18:53.455 { 00:18:53.455 "name": "pt3", 00:18:53.455 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:53.455 "is_configured": true, 00:18:53.455 "data_offset": 2048, 00:18:53.455 "data_size": 63488 00:18:53.455 }, 00:18:53.455 { 00:18:53.455 "name": "pt4", 00:18:53.455 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:53.455 "is_configured": true, 00:18:53.455 "data_offset": 2048, 00:18:53.455 "data_size": 63488 00:18:53.455 } 00:18:53.455 ] 00:18:53.455 } 00:18:53.455 } 00:18:53.455 }' 00:18:53.455 18:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:18:53.455 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:53.455 pt2 00:18:53.455 pt3 00:18:53.455 pt4' 00:18:53.455 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.455 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:53.455 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.714 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:53.714 "name": "pt1", 00:18:53.714 "aliases": [ 00:18:53.714 "00000000-0000-0000-0000-000000000001" 00:18:53.714 ], 00:18:53.714 "product_name": "passthru", 00:18:53.714 "block_size": 512, 00:18:53.714 "num_blocks": 65536, 00:18:53.714 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:53.714 "assigned_rate_limits": { 00:18:53.714 "rw_ios_per_sec": 0, 00:18:53.714 "rw_mbytes_per_sec": 0, 00:18:53.714 "r_mbytes_per_sec": 0, 00:18:53.714 "w_mbytes_per_sec": 0 00:18:53.714 }, 00:18:53.714 "claimed": true, 00:18:53.714 "claim_type": "exclusive_write", 00:18:53.714 "zoned": false, 00:18:53.714 "supported_io_types": { 00:18:53.714 "read": true, 00:18:53.714 "write": true, 00:18:53.714 "unmap": true, 00:18:53.714 "flush": true, 00:18:53.714 "reset": true, 00:18:53.714 "nvme_admin": false, 00:18:53.714 "nvme_io": false, 00:18:53.714 "nvme_io_md": false, 00:18:53.714 "write_zeroes": true, 00:18:53.714 "zcopy": true, 00:18:53.714 "get_zone_info": false, 00:18:53.714 "zone_management": false, 00:18:53.714 "zone_append": false, 00:18:53.714 "compare": false, 00:18:53.714 "compare_and_write": false, 00:18:53.714 "abort": true, 00:18:53.714 "seek_hole": false, 00:18:53.714 "seek_data": false, 00:18:53.714 "copy": true, 00:18:53.714 "nvme_iov_md": false 00:18:53.714 }, 00:18:53.714 "memory_domains": [ 00:18:53.714 { 
00:18:53.714 "dma_device_id": "system", 00:18:53.714 "dma_device_type": 1 00:18:53.714 }, 00:18:53.714 { 00:18:53.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.714 "dma_device_type": 2 00:18:53.714 } 00:18:53.714 ], 00:18:53.714 "driver_specific": { 00:18:53.714 "passthru": { 00:18:53.714 "name": "pt1", 00:18:53.714 "base_bdev_name": "malloc1" 00:18:53.714 } 00:18:53.714 } 00:18:53.714 }' 00:18:53.714 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.714 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.714 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:53.714 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.714 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:53.972 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.231 
18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.231 "name": "pt2", 00:18:54.231 "aliases": [ 00:18:54.231 "00000000-0000-0000-0000-000000000002" 00:18:54.231 ], 00:18:54.231 "product_name": "passthru", 00:18:54.231 "block_size": 512, 00:18:54.231 "num_blocks": 65536, 00:18:54.231 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:54.231 "assigned_rate_limits": { 00:18:54.231 "rw_ios_per_sec": 0, 00:18:54.231 "rw_mbytes_per_sec": 0, 00:18:54.231 "r_mbytes_per_sec": 0, 00:18:54.231 "w_mbytes_per_sec": 0 00:18:54.231 }, 00:18:54.231 "claimed": true, 00:18:54.231 "claim_type": "exclusive_write", 00:18:54.231 "zoned": false, 00:18:54.231 "supported_io_types": { 00:18:54.231 "read": true, 00:18:54.231 "write": true, 00:18:54.231 "unmap": true, 00:18:54.231 "flush": true, 00:18:54.231 "reset": true, 00:18:54.231 "nvme_admin": false, 00:18:54.231 "nvme_io": false, 00:18:54.231 "nvme_io_md": false, 00:18:54.231 "write_zeroes": true, 00:18:54.231 "zcopy": true, 00:18:54.231 "get_zone_info": false, 00:18:54.231 "zone_management": false, 00:18:54.231 "zone_append": false, 00:18:54.231 "compare": false, 00:18:54.231 "compare_and_write": false, 00:18:54.231 "abort": true, 00:18:54.231 "seek_hole": false, 00:18:54.231 "seek_data": false, 00:18:54.231 "copy": true, 00:18:54.231 "nvme_iov_md": false 00:18:54.231 }, 00:18:54.231 "memory_domains": [ 00:18:54.231 { 00:18:54.231 "dma_device_id": "system", 00:18:54.231 "dma_device_type": 1 00:18:54.231 }, 00:18:54.231 { 00:18:54.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.231 "dma_device_type": 2 00:18:54.231 } 00:18:54.231 ], 00:18:54.231 "driver_specific": { 00:18:54.231 "passthru": { 00:18:54.231 "name": "pt2", 00:18:54.231 "base_bdev_name": "malloc2" 00:18:54.231 } 00:18:54.231 } 00:18:54.231 }' 00:18:54.231 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.231 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:18:54.231 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.231 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.490 18:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:54.490 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.749 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.749 "name": "pt3", 00:18:54.749 "aliases": [ 00:18:54.749 "00000000-0000-0000-0000-000000000003" 00:18:54.749 ], 00:18:54.749 "product_name": "passthru", 00:18:54.749 "block_size": 512, 00:18:54.749 "num_blocks": 65536, 00:18:54.749 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:54.749 "assigned_rate_limits": { 00:18:54.749 "rw_ios_per_sec": 0, 00:18:54.749 "rw_mbytes_per_sec": 0, 00:18:54.749 "r_mbytes_per_sec": 0, 00:18:54.749 "w_mbytes_per_sec": 0 00:18:54.749 }, 
00:18:54.749 "claimed": true, 00:18:54.749 "claim_type": "exclusive_write", 00:18:54.749 "zoned": false, 00:18:54.749 "supported_io_types": { 00:18:54.749 "read": true, 00:18:54.749 "write": true, 00:18:54.749 "unmap": true, 00:18:54.749 "flush": true, 00:18:54.749 "reset": true, 00:18:54.749 "nvme_admin": false, 00:18:54.749 "nvme_io": false, 00:18:54.749 "nvme_io_md": false, 00:18:54.749 "write_zeroes": true, 00:18:54.749 "zcopy": true, 00:18:54.749 "get_zone_info": false, 00:18:54.749 "zone_management": false, 00:18:54.749 "zone_append": false, 00:18:54.749 "compare": false, 00:18:54.749 "compare_and_write": false, 00:18:54.749 "abort": true, 00:18:54.749 "seek_hole": false, 00:18:54.749 "seek_data": false, 00:18:54.749 "copy": true, 00:18:54.749 "nvme_iov_md": false 00:18:54.749 }, 00:18:54.749 "memory_domains": [ 00:18:54.749 { 00:18:54.749 "dma_device_id": "system", 00:18:54.749 "dma_device_type": 1 00:18:54.749 }, 00:18:54.749 { 00:18:54.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.749 "dma_device_type": 2 00:18:54.749 } 00:18:54.749 ], 00:18:54.749 "driver_specific": { 00:18:54.749 "passthru": { 00:18:54.749 "name": "pt3", 00:18:54.749 "base_bdev_name": "malloc3" 00:18:54.749 } 00:18:54.749 } 00:18:54.749 }' 00:18:54.749 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.007 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.266 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.266 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:55.266 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:55.266 18:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.524 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.524 "name": "pt4", 00:18:55.524 "aliases": [ 00:18:55.524 "00000000-0000-0000-0000-000000000004" 00:18:55.524 ], 00:18:55.524 "product_name": "passthru", 00:18:55.524 "block_size": 512, 00:18:55.524 "num_blocks": 65536, 00:18:55.524 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:55.524 "assigned_rate_limits": { 00:18:55.524 "rw_ios_per_sec": 0, 00:18:55.524 "rw_mbytes_per_sec": 0, 00:18:55.524 "r_mbytes_per_sec": 0, 00:18:55.524 "w_mbytes_per_sec": 0 00:18:55.524 }, 00:18:55.524 "claimed": true, 00:18:55.524 "claim_type": "exclusive_write", 00:18:55.524 "zoned": false, 00:18:55.524 "supported_io_types": { 00:18:55.524 "read": true, 00:18:55.524 "write": true, 00:18:55.524 "unmap": true, 00:18:55.524 "flush": true, 00:18:55.524 "reset": true, 00:18:55.524 "nvme_admin": false, 00:18:55.524 "nvme_io": false, 00:18:55.524 "nvme_io_md": false, 00:18:55.524 "write_zeroes": true, 00:18:55.524 "zcopy": true, 00:18:55.524 "get_zone_info": false, 00:18:55.524 "zone_management": false, 00:18:55.524 "zone_append": false, 00:18:55.524 
"compare": false, 00:18:55.524 "compare_and_write": false, 00:18:55.524 "abort": true, 00:18:55.524 "seek_hole": false, 00:18:55.524 "seek_data": false, 00:18:55.524 "copy": true, 00:18:55.524 "nvme_iov_md": false 00:18:55.524 }, 00:18:55.524 "memory_domains": [ 00:18:55.524 { 00:18:55.524 "dma_device_id": "system", 00:18:55.524 "dma_device_type": 1 00:18:55.524 }, 00:18:55.524 { 00:18:55.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.524 "dma_device_type": 2 00:18:55.524 } 00:18:55.524 ], 00:18:55.524 "driver_specific": { 00:18:55.524 "passthru": { 00:18:55.524 "name": "pt4", 00:18:55.524 "base_bdev_name": "malloc4" 00:18:55.524 } 00:18:55.524 } 00:18:55.524 }' 00:18:55.524 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.524 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.524 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.524 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.524 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.524 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.524 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.783 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.783 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.783 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.783 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.783 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.783 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:55.783 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:56.349 [2024-07-12 18:22:39.874276] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' fda48608-27c8-4100-a20e-1e962fadc5f6 '!=' fda48608-27c8-4100-a20e-1e962fadc5f6 ']' 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2529047 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2529047 ']' 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2529047 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2529047 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2529047' 00:18:56.349 killing process with pid 2529047 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2529047 00:18:56.349 [2024-07-12 
18:22:39.973404] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:56.349 [2024-07-12 18:22:39.973465] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:56.349 [2024-07-12 18:22:39.973527] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:56.349 [2024-07-12 18:22:39.973539] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x821780 name raid_bdev1, state offline 00:18:56.349 18:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2529047 00:18:56.349 [2024-07-12 18:22:40.010109] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:56.608 18:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:56.608 00:18:56.608 real 0m16.467s 00:18:56.608 user 0m29.726s 00:18:56.608 sys 0m2.954s 00:18:56.608 18:22:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:56.608 18:22:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.608 ************************************ 00:18:56.608 END TEST raid_superblock_test 00:18:56.608 ************************************ 00:18:56.608 18:22:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:56.608 18:22:40 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:18:56.608 18:22:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:56.608 18:22:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:56.608 18:22:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:56.608 ************************************ 00:18:56.608 START TEST raid_read_error_test 00:18:56.608 ************************************ 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:18:56.608 18:22:40 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.HfGsnsld6s 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2531482 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2531482 /var/tmp/spdk-raid.sock 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2531482 ']' 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:18:56.608 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:56.608 18:22:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.867 [2024-07-12 18:22:40.358141] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:18:56.867 [2024-07-12 18:22:40.358203] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2531482 ] 00:18:56.867 [2024-07-12 18:22:40.495998] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:57.132 [2024-07-12 18:22:40.598423] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:57.132 [2024-07-12 18:22:40.665157] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:57.132 [2024-07-12 18:22:40.665204] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:57.751 18:22:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:57.751 18:22:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:57.751 18:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:57.751 18:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:58.009 BaseBdev1_malloc 00:18:58.009 18:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:58.268 true 00:18:58.268 
18:22:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:58.526 [2024-07-12 18:22:41.997140] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:58.526 [2024-07-12 18:22:41.997183] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.526 [2024-07-12 18:22:41.997204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x237c0d0 00:18:58.526 [2024-07-12 18:22:41.997217] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.526 [2024-07-12 18:22:41.999066] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.526 [2024-07-12 18:22:41.999096] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:58.526 BaseBdev1 00:18:58.526 18:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:58.526 18:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:58.526 BaseBdev2_malloc 00:18:58.785 18:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:58.785 true 00:18:58.785 18:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:59.042 [2024-07-12 18:22:42.715607] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:59.043 [2024-07-12 18:22:42.715653] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:59.043 [2024-07-12 18:22:42.715673] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2380910 00:18:59.043 [2024-07-12 18:22:42.715685] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.043 [2024-07-12 18:22:42.717243] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:59.043 [2024-07-12 18:22:42.717271] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:59.043 BaseBdev2 00:18:59.043 18:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:59.043 18:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:59.300 BaseBdev3_malloc 00:18:59.300 18:22:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:59.558 true 00:18:59.558 18:22:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:59.816 [2024-07-12 18:22:43.455377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:59.816 [2024-07-12 18:22:43.455425] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:59.816 [2024-07-12 18:22:43.455446] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2382bd0 00:18:59.816 [2024-07-12 18:22:43.455459] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.816 [2024-07-12 18:22:43.457070] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:18:59.816 [2024-07-12 18:22:43.457098] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:59.816 BaseBdev3 00:18:59.816 18:22:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:59.816 18:22:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:00.074 BaseBdev4_malloc 00:19:00.074 18:22:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:00.332 true 00:19:00.332 18:22:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:00.591 [2024-07-12 18:22:44.165797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:00.591 [2024-07-12 18:22:44.165841] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:00.591 [2024-07-12 18:22:44.165861] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2383aa0 00:19:00.591 [2024-07-12 18:22:44.165874] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:00.591 [2024-07-12 18:22:44.167428] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:00.591 [2024-07-12 18:22:44.167457] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:00.591 BaseBdev4 00:19:00.591 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:19:00.849 [2024-07-12 18:22:44.398445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:00.849 [2024-07-12 18:22:44.399794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:00.849 [2024-07-12 18:22:44.399861] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:00.849 [2024-07-12 18:22:44.399923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:00.849 [2024-07-12 18:22:44.400207] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x237dc20 00:19:00.849 [2024-07-12 18:22:44.400219] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:00.850 [2024-07-12 18:22:44.400423] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d2260 00:19:00.850 [2024-07-12 18:22:44.400568] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x237dc20 00:19:00.850 [2024-07-12 18:22:44.400578] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x237dc20 00:19:00.850 [2024-07-12 18:22:44.400681] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.850 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:01.108 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.108 "name": "raid_bdev1", 00:19:01.108 "uuid": "b044b040-9f52-40a9-a41a-2c87f63748f8", 00:19:01.108 "strip_size_kb": 64, 00:19:01.108 "state": "online", 00:19:01.108 "raid_level": "raid0", 00:19:01.108 "superblock": true, 00:19:01.108 "num_base_bdevs": 4, 00:19:01.108 "num_base_bdevs_discovered": 4, 00:19:01.108 "num_base_bdevs_operational": 4, 00:19:01.108 "base_bdevs_list": [ 00:19:01.108 { 00:19:01.108 "name": "BaseBdev1", 00:19:01.108 "uuid": "7171e678-07ad-53e9-945a-51d6e2dd9e77", 00:19:01.108 "is_configured": true, 00:19:01.108 "data_offset": 2048, 00:19:01.108 "data_size": 63488 00:19:01.108 }, 00:19:01.108 { 00:19:01.108 "name": "BaseBdev2", 00:19:01.108 "uuid": "91e80b14-c06d-51b5-b8cc-a94365c527ff", 00:19:01.108 "is_configured": true, 00:19:01.108 "data_offset": 2048, 00:19:01.108 "data_size": 63488 00:19:01.108 }, 00:19:01.108 { 00:19:01.108 "name": "BaseBdev3", 00:19:01.108 "uuid": "8dc39f66-57ad-5236-a7fb-0427e080089c", 00:19:01.108 "is_configured": true, 00:19:01.108 "data_offset": 2048, 00:19:01.108 "data_size": 63488 00:19:01.108 }, 00:19:01.108 { 00:19:01.108 "name": "BaseBdev4", 00:19:01.108 "uuid": "819d21a6-9a10-5383-926d-8043c106664a", 00:19:01.108 
"is_configured": true, 00:19:01.108 "data_offset": 2048, 00:19:01.108 "data_size": 63488 00:19:01.108 } 00:19:01.108 ] 00:19:01.108 }' 00:19:01.108 18:22:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.108 18:22:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:01.675 18:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:01.675 18:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:01.675 [2024-07-12 18:22:45.357251] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x236ffc0 00:19:02.613 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.872 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:03.131 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.131 "name": "raid_bdev1", 00:19:03.131 "uuid": "b044b040-9f52-40a9-a41a-2c87f63748f8", 00:19:03.131 "strip_size_kb": 64, 00:19:03.131 "state": "online", 00:19:03.131 "raid_level": "raid0", 00:19:03.131 "superblock": true, 00:19:03.131 "num_base_bdevs": 4, 00:19:03.131 "num_base_bdevs_discovered": 4, 00:19:03.131 "num_base_bdevs_operational": 4, 00:19:03.131 "base_bdevs_list": [ 00:19:03.131 { 00:19:03.131 "name": "BaseBdev1", 00:19:03.131 "uuid": "7171e678-07ad-53e9-945a-51d6e2dd9e77", 00:19:03.131 "is_configured": true, 00:19:03.131 "data_offset": 2048, 00:19:03.131 "data_size": 63488 00:19:03.131 }, 00:19:03.131 { 00:19:03.131 "name": "BaseBdev2", 00:19:03.131 "uuid": "91e80b14-c06d-51b5-b8cc-a94365c527ff", 00:19:03.131 "is_configured": true, 00:19:03.131 "data_offset": 2048, 00:19:03.131 "data_size": 63488 00:19:03.131 }, 00:19:03.131 { 00:19:03.131 "name": "BaseBdev3", 00:19:03.131 "uuid": "8dc39f66-57ad-5236-a7fb-0427e080089c", 00:19:03.131 "is_configured": true, 00:19:03.131 "data_offset": 2048, 00:19:03.131 "data_size": 63488 00:19:03.131 }, 00:19:03.131 { 00:19:03.131 "name": "BaseBdev4", 00:19:03.131 "uuid": 
"819d21a6-9a10-5383-926d-8043c106664a", 00:19:03.131 "is_configured": true, 00:19:03.131 "data_offset": 2048, 00:19:03.131 "data_size": 63488 00:19:03.131 } 00:19:03.131 ] 00:19:03.131 }' 00:19:03.131 18:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.131 18:22:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:03.698 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:03.957 [2024-07-12 18:22:47.428739] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:03.957 [2024-07-12 18:22:47.428771] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:03.957 [2024-07-12 18:22:47.431917] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:03.957 [2024-07-12 18:22:47.431958] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:03.957 [2024-07-12 18:22:47.431998] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:03.957 [2024-07-12 18:22:47.432009] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x237dc20 name raid_bdev1, state offline 00:19:03.957 0 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2531482 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2531482 ']' 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2531482 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
ps --no-headers -o comm= 2531482 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2531482' 00:19:03.957 killing process with pid 2531482 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2531482 00:19:03.957 [2024-07-12 18:22:47.495672] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:03.957 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2531482 00:19:03.957 [2024-07-12 18:22:47.527047] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.HfGsnsld6s 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.48 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.48 != \0\.\0\0 ]] 00:19:04.216 00:19:04.216 real 0m7.484s 00:19:04.216 user 0m11.945s 00:19:04.216 sys 0m1.308s 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:04.216 18:22:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.216 
************************************ 00:19:04.216 END TEST raid_read_error_test 00:19:04.216 ************************************ 00:19:04.216 18:22:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:04.216 18:22:47 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:19:04.216 18:22:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:04.216 18:22:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:04.216 18:22:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:04.216 ************************************ 00:19:04.216 START TEST raid_write_error_test 00:19:04.216 ************************************ 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.9HMiUbI9ZG 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # 
raid_pid=2532633 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2532633 /var/tmp/spdk-raid.sock 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2532633 ']' 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:04.216 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:04.216 18:22:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.216 [2024-07-12 18:22:47.922911] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:19:04.216 [2024-07-12 18:22:47.922990] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2532633 ] 00:19:04.476 [2024-07-12 18:22:48.045449] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:04.476 [2024-07-12 18:22:48.152641] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:04.734 [2024-07-12 18:22:48.222530] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:04.734 [2024-07-12 18:22:48.222563] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:05.301 18:22:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:05.301 18:22:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:05.301 18:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:05.301 18:22:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:05.560 BaseBdev1_malloc 00:19:05.560 18:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:05.819 true 00:19:05.819 18:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:06.078 [2024-07-12 18:22:49.581084] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:06.078 [2024-07-12 18:22:49.581130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:19:06.078 [2024-07-12 18:22:49.581151] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27840d0 00:19:06.078 [2024-07-12 18:22:49.581163] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:06.078 [2024-07-12 18:22:49.583040] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:06.078 [2024-07-12 18:22:49.583069] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:06.078 BaseBdev1 00:19:06.078 18:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:06.078 18:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:06.337 BaseBdev2_malloc 00:19:06.337 18:22:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:06.596 true 00:19:06.596 18:22:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:06.854 [2024-07-12 18:22:50.577496] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:06.854 [2024-07-12 18:22:50.577542] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:06.854 [2024-07-12 18:22:50.577565] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2788910 00:19:06.854 [2024-07-12 18:22:50.577579] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:06.854 [2024-07-12 18:22:50.579185] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:06.854 [2024-07-12 18:22:50.579213] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:07.113 BaseBdev2 00:19:07.113 18:22:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:07.113 18:22:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:07.372 BaseBdev3_malloc 00:19:07.372 18:22:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:07.372 true 00:19:07.372 18:22:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:07.631 [2024-07-12 18:22:51.315987] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:07.631 [2024-07-12 18:22:51.316030] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:07.631 [2024-07-12 18:22:51.316051] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x278abd0 00:19:07.631 [2024-07-12 18:22:51.316064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:07.631 [2024-07-12 18:22:51.317598] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:07.631 [2024-07-12 18:22:51.317627] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:07.631 BaseBdev3 00:19:07.631 18:22:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:07.631 18:22:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:07.890 BaseBdev4_malloc 00:19:07.890 18:22:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:08.148 true 00:19:08.148 18:22:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:08.407 [2024-07-12 18:22:52.043690] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:08.407 [2024-07-12 18:22:52.043737] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:08.407 [2024-07-12 18:22:52.043759] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x278baa0 00:19:08.407 [2024-07-12 18:22:52.043771] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:08.407 [2024-07-12 18:22:52.045360] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:08.407 [2024-07-12 18:22:52.045388] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:08.407 BaseBdev4 00:19:08.407 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:08.666 [2024-07-12 18:22:52.280356] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:08.666 [2024-07-12 18:22:52.281676] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:08.666 [2024-07-12 18:22:52.281744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:08.666 [2024-07-12 18:22:52.281804] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:08.666 [2024-07-12 18:22:52.282033] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2785c20 00:19:08.666 [2024-07-12 18:22:52.282045] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:08.666 [2024-07-12 18:22:52.282236] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25da260 00:19:08.666 [2024-07-12 18:22:52.282385] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2785c20 00:19:08.666 [2024-07-12 18:22:52.282395] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2785c20 00:19:08.666 [2024-07-12 18:22:52.282499] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:08.666 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:08.666 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:08.666 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:08.666 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:08.666 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:08.667 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.667 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.667 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.667 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.667 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.667 18:22:52 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.667 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:08.926 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.926 "name": "raid_bdev1", 00:19:08.926 "uuid": "b3ee5fab-34fa-43d2-8a14-c4f35298554a", 00:19:08.926 "strip_size_kb": 64, 00:19:08.926 "state": "online", 00:19:08.926 "raid_level": "raid0", 00:19:08.926 "superblock": true, 00:19:08.926 "num_base_bdevs": 4, 00:19:08.926 "num_base_bdevs_discovered": 4, 00:19:08.926 "num_base_bdevs_operational": 4, 00:19:08.926 "base_bdevs_list": [ 00:19:08.926 { 00:19:08.926 "name": "BaseBdev1", 00:19:08.926 "uuid": "13ae6c90-e08a-5bc6-87ae-5d3575690b5e", 00:19:08.926 "is_configured": true, 00:19:08.926 "data_offset": 2048, 00:19:08.926 "data_size": 63488 00:19:08.926 }, 00:19:08.926 { 00:19:08.926 "name": "BaseBdev2", 00:19:08.926 "uuid": "6bddbeba-6c2b-5886-927f-d48ac30e4a73", 00:19:08.926 "is_configured": true, 00:19:08.926 "data_offset": 2048, 00:19:08.926 "data_size": 63488 00:19:08.926 }, 00:19:08.926 { 00:19:08.926 "name": "BaseBdev3", 00:19:08.926 "uuid": "56166e8b-c7ba-51fe-9967-710d54873b88", 00:19:08.926 "is_configured": true, 00:19:08.926 "data_offset": 2048, 00:19:08.926 "data_size": 63488 00:19:08.926 }, 00:19:08.926 { 00:19:08.926 "name": "BaseBdev4", 00:19:08.926 "uuid": "12f667e6-4a40-5ff3-8035-d9b29f0fcd51", 00:19:08.926 "is_configured": true, 00:19:08.926 "data_offset": 2048, 00:19:08.926 "data_size": 63488 00:19:08.926 } 00:19:08.926 ] 00:19:08.926 }' 00:19:08.926 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.926 18:22:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.493 18:22:52 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:09.493 18:22:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:09.493 [2024-07-12 18:22:53.030601] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2777fc0 00:19:10.428 18:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.687 18:22:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.687 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.254 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.254 "name": "raid_bdev1", 00:19:11.254 "uuid": "b3ee5fab-34fa-43d2-8a14-c4f35298554a", 00:19:11.254 "strip_size_kb": 64, 00:19:11.254 "state": "online", 00:19:11.254 "raid_level": "raid0", 00:19:11.254 "superblock": true, 00:19:11.254 "num_base_bdevs": 4, 00:19:11.254 "num_base_bdevs_discovered": 4, 00:19:11.254 "num_base_bdevs_operational": 4, 00:19:11.254 "base_bdevs_list": [ 00:19:11.254 { 00:19:11.254 "name": "BaseBdev1", 00:19:11.254 "uuid": "13ae6c90-e08a-5bc6-87ae-5d3575690b5e", 00:19:11.254 "is_configured": true, 00:19:11.254 "data_offset": 2048, 00:19:11.254 "data_size": 63488 00:19:11.254 }, 00:19:11.254 { 00:19:11.254 "name": "BaseBdev2", 00:19:11.254 "uuid": "6bddbeba-6c2b-5886-927f-d48ac30e4a73", 00:19:11.254 "is_configured": true, 00:19:11.254 "data_offset": 2048, 00:19:11.254 "data_size": 63488 00:19:11.254 }, 00:19:11.254 { 00:19:11.254 "name": "BaseBdev3", 00:19:11.254 "uuid": "56166e8b-c7ba-51fe-9967-710d54873b88", 00:19:11.254 "is_configured": true, 00:19:11.254 "data_offset": 2048, 00:19:11.254 "data_size": 63488 00:19:11.254 }, 00:19:11.254 { 00:19:11.254 "name": "BaseBdev4", 00:19:11.254 "uuid": "12f667e6-4a40-5ff3-8035-d9b29f0fcd51", 00:19:11.254 "is_configured": true, 00:19:11.254 "data_offset": 2048, 00:19:11.254 "data_size": 63488 00:19:11.254 } 00:19:11.254 ] 00:19:11.254 }' 00:19:11.254 18:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.254 18:22:54 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:19:11.821 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:11.821 [2024-07-12 18:22:55.518896] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:11.821 [2024-07-12 18:22:55.518947] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:11.821 [2024-07-12 18:22:55.522098] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:11.821 [2024-07-12 18:22:55.522134] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:11.821 [2024-07-12 18:22:55.522174] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:11.821 [2024-07-12 18:22:55.522186] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2785c20 name raid_bdev1, state offline 00:19:11.821 0 00:19:11.821 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2532633 00:19:11.821 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2532633 ']' 00:19:11.821 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2532633 00:19:11.821 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:11.821 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:11.821 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2532633 00:19:12.079 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:12.079 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:12.079 18:22:55 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2532633' 00:19:12.079 killing process with pid 2532633 00:19:12.079 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2532633 00:19:12.079 [2024-07-12 18:22:55.584284] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:12.079 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2532633 00:19:12.079 [2024-07-12 18:22:55.616343] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.9HMiUbI9ZG 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:19:12.337 00:19:12.337 real 0m8.011s 00:19:12.337 user 0m13.003s 00:19:12.337 sys 0m1.305s 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:12.337 18:22:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.337 ************************************ 00:19:12.337 END TEST raid_write_error_test 00:19:12.337 ************************************ 00:19:12.337 18:22:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:12.337 18:22:55 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:12.337 
18:22:55 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:19:12.337 18:22:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:12.337 18:22:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:12.337 18:22:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:12.337 ************************************ 00:19:12.337 START TEST raid_state_function_test 00:19:12.337 ************************************ 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2533755 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2533755' 00:19:12.337 Process raid pid: 2533755 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2533755 /var/tmp/spdk-raid.sock 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2533755 ']' 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:12.337 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:12.337 18:22:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.337 [2024-07-12 18:22:56.018464] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:19:12.338 [2024-07-12 18:22:56.018534] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:12.595 [2024-07-12 18:22:56.147399] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:12.595 [2024-07-12 18:22:56.251402] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:12.595 [2024-07-12 18:22:56.310817] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:12.595 [2024-07-12 18:22:56.310842] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:13.529 18:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:13.529 18:22:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:13.529 18:22:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:13.529 [2024-07-12 18:22:57.176865] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:13.529 [2024-07-12 18:22:57.176909] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:13.529 [2024-07-12 18:22:57.176920] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:13.529 [2024-07-12 18:22:57.176948] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:13.529 [2024-07-12 18:22:57.176956] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:13.529 [2024-07-12 18:22:57.176967] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:19:13.529 [2024-07-12 18:22:57.176976] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:13.529 [2024-07-12 18:22:57.176987] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.529 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:13.787 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:13.787 "name": "Existed_Raid", 00:19:13.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.787 "strip_size_kb": 64, 
00:19:13.787 "state": "configuring", 00:19:13.787 "raid_level": "concat", 00:19:13.787 "superblock": false, 00:19:13.787 "num_base_bdevs": 4, 00:19:13.787 "num_base_bdevs_discovered": 0, 00:19:13.787 "num_base_bdevs_operational": 4, 00:19:13.787 "base_bdevs_list": [ 00:19:13.787 { 00:19:13.787 "name": "BaseBdev1", 00:19:13.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.787 "is_configured": false, 00:19:13.787 "data_offset": 0, 00:19:13.788 "data_size": 0 00:19:13.788 }, 00:19:13.788 { 00:19:13.788 "name": "BaseBdev2", 00:19:13.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.788 "is_configured": false, 00:19:13.788 "data_offset": 0, 00:19:13.788 "data_size": 0 00:19:13.788 }, 00:19:13.788 { 00:19:13.788 "name": "BaseBdev3", 00:19:13.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.788 "is_configured": false, 00:19:13.788 "data_offset": 0, 00:19:13.788 "data_size": 0 00:19:13.788 }, 00:19:13.788 { 00:19:13.788 "name": "BaseBdev4", 00:19:13.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:13.788 "is_configured": false, 00:19:13.788 "data_offset": 0, 00:19:13.788 "data_size": 0 00:19:13.788 } 00:19:13.788 ] 00:19:13.788 }' 00:19:13.788 18:22:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:13.788 18:22:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.385 18:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:14.644 [2024-07-12 18:22:58.259592] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:14.644 [2024-07-12 18:22:58.259623] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xccfaa0 name Existed_Raid, state configuring 00:19:14.644 18:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:14.909 [2024-07-12 18:22:58.496253] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:14.910 [2024-07-12 18:22:58.496283] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:14.910 [2024-07-12 18:22:58.496293] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:14.910 [2024-07-12 18:22:58.496305] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:14.910 [2024-07-12 18:22:58.496314] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:14.910 [2024-07-12 18:22:58.496325] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:14.910 [2024-07-12 18:22:58.496334] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:14.910 [2024-07-12 18:22:58.496345] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:14.910 18:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:15.168 [2024-07-12 18:22:58.754834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:15.168 BaseBdev1 00:19:15.168 18:22:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:15.168 18:22:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:15.168 18:22:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:15.168 18:22:58 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@899 -- # local i 00:19:15.168 18:22:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:15.168 18:22:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:15.168 18:22:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:15.427 18:22:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:15.685 [ 00:19:15.685 { 00:19:15.685 "name": "BaseBdev1", 00:19:15.685 "aliases": [ 00:19:15.685 "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322" 00:19:15.685 ], 00:19:15.685 "product_name": "Malloc disk", 00:19:15.685 "block_size": 512, 00:19:15.685 "num_blocks": 65536, 00:19:15.685 "uuid": "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322", 00:19:15.685 "assigned_rate_limits": { 00:19:15.685 "rw_ios_per_sec": 0, 00:19:15.685 "rw_mbytes_per_sec": 0, 00:19:15.685 "r_mbytes_per_sec": 0, 00:19:15.685 "w_mbytes_per_sec": 0 00:19:15.685 }, 00:19:15.685 "claimed": true, 00:19:15.685 "claim_type": "exclusive_write", 00:19:15.685 "zoned": false, 00:19:15.685 "supported_io_types": { 00:19:15.685 "read": true, 00:19:15.685 "write": true, 00:19:15.685 "unmap": true, 00:19:15.685 "flush": true, 00:19:15.685 "reset": true, 00:19:15.685 "nvme_admin": false, 00:19:15.685 "nvme_io": false, 00:19:15.685 "nvme_io_md": false, 00:19:15.685 "write_zeroes": true, 00:19:15.685 "zcopy": true, 00:19:15.685 "get_zone_info": false, 00:19:15.685 "zone_management": false, 00:19:15.685 "zone_append": false, 00:19:15.685 "compare": false, 00:19:15.685 "compare_and_write": false, 00:19:15.685 "abort": true, 00:19:15.685 "seek_hole": false, 00:19:15.685 "seek_data": false, 00:19:15.685 "copy": true, 00:19:15.685 "nvme_iov_md": 
false 00:19:15.685 }, 00:19:15.685 "memory_domains": [ 00:19:15.685 { 00:19:15.685 "dma_device_id": "system", 00:19:15.685 "dma_device_type": 1 00:19:15.685 }, 00:19:15.685 { 00:19:15.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.685 "dma_device_type": 2 00:19:15.685 } 00:19:15.685 ], 00:19:15.685 "driver_specific": {} 00:19:15.685 } 00:19:15.685 ] 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.685 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.943 18:22:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.944 "name": "Existed_Raid", 00:19:15.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.944 "strip_size_kb": 64, 00:19:15.944 "state": "configuring", 00:19:15.944 "raid_level": "concat", 00:19:15.944 "superblock": false, 00:19:15.944 "num_base_bdevs": 4, 00:19:15.944 "num_base_bdevs_discovered": 1, 00:19:15.944 "num_base_bdevs_operational": 4, 00:19:15.944 "base_bdevs_list": [ 00:19:15.944 { 00:19:15.944 "name": "BaseBdev1", 00:19:15.944 "uuid": "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322", 00:19:15.944 "is_configured": true, 00:19:15.944 "data_offset": 0, 00:19:15.944 "data_size": 65536 00:19:15.944 }, 00:19:15.944 { 00:19:15.944 "name": "BaseBdev2", 00:19:15.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.944 "is_configured": false, 00:19:15.944 "data_offset": 0, 00:19:15.944 "data_size": 0 00:19:15.944 }, 00:19:15.944 { 00:19:15.944 "name": "BaseBdev3", 00:19:15.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.944 "is_configured": false, 00:19:15.944 "data_offset": 0, 00:19:15.944 "data_size": 0 00:19:15.944 }, 00:19:15.944 { 00:19:15.944 "name": "BaseBdev4", 00:19:15.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.944 "is_configured": false, 00:19:15.944 "data_offset": 0, 00:19:15.944 "data_size": 0 00:19:15.944 } 00:19:15.944 ] 00:19:15.944 }' 00:19:15.944 18:22:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.944 18:22:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:16.512 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:16.770 [2024-07-12 18:23:00.327036] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:16.770 [2024-07-12 18:23:00.327078] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xccf310 name Existed_Raid, state configuring 00:19:16.770 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:17.029 [2024-07-12 18:23:00.503535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:17.029 [2024-07-12 18:23:00.504982] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:17.029 [2024-07-12 18:23:00.505014] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:17.029 [2024-07-12 18:23:00.505024] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:17.029 [2024-07-12 18:23:00.505035] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:17.029 [2024-07-12 18:23:00.505044] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:17.029 [2024-07-12 18:23:00.505055] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.029 "name": "Existed_Raid", 00:19:17.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.029 "strip_size_kb": 64, 00:19:17.029 "state": "configuring", 00:19:17.029 "raid_level": "concat", 00:19:17.029 "superblock": false, 00:19:17.029 "num_base_bdevs": 4, 00:19:17.029 "num_base_bdevs_discovered": 1, 00:19:17.029 "num_base_bdevs_operational": 4, 00:19:17.029 "base_bdevs_list": [ 00:19:17.029 { 00:19:17.029 "name": "BaseBdev1", 00:19:17.029 "uuid": "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322", 00:19:17.029 "is_configured": true, 00:19:17.029 "data_offset": 0, 00:19:17.029 "data_size": 65536 00:19:17.029 }, 00:19:17.029 { 00:19:17.029 "name": "BaseBdev2", 00:19:17.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.029 "is_configured": false, 00:19:17.029 "data_offset": 0, 00:19:17.029 "data_size": 0 00:19:17.029 }, 00:19:17.029 { 00:19:17.029 "name": "BaseBdev3", 
00:19:17.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.029 "is_configured": false, 00:19:17.029 "data_offset": 0, 00:19:17.029 "data_size": 0 00:19:17.029 }, 00:19:17.029 { 00:19:17.029 "name": "BaseBdev4", 00:19:17.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.029 "is_configured": false, 00:19:17.029 "data_offset": 0, 00:19:17.029 "data_size": 0 00:19:17.029 } 00:19:17.029 ] 00:19:17.029 }' 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.029 18:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.605 18:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:17.863 [2024-07-12 18:23:01.545680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:17.863 BaseBdev2 00:19:17.864 18:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:17.864 18:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:17.864 18:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:17.864 18:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:17.864 18:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:17.864 18:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:17.864 18:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.122 18:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:18.382 [ 00:19:18.382 { 00:19:18.382 "name": "BaseBdev2", 00:19:18.382 "aliases": [ 00:19:18.382 "6be219e7-3b48-47d1-a9fe-130c5ba34813" 00:19:18.382 ], 00:19:18.382 "product_name": "Malloc disk", 00:19:18.382 "block_size": 512, 00:19:18.382 "num_blocks": 65536, 00:19:18.382 "uuid": "6be219e7-3b48-47d1-a9fe-130c5ba34813", 00:19:18.382 "assigned_rate_limits": { 00:19:18.382 "rw_ios_per_sec": 0, 00:19:18.382 "rw_mbytes_per_sec": 0, 00:19:18.382 "r_mbytes_per_sec": 0, 00:19:18.382 "w_mbytes_per_sec": 0 00:19:18.382 }, 00:19:18.382 "claimed": true, 00:19:18.382 "claim_type": "exclusive_write", 00:19:18.382 "zoned": false, 00:19:18.382 "supported_io_types": { 00:19:18.382 "read": true, 00:19:18.382 "write": true, 00:19:18.382 "unmap": true, 00:19:18.382 "flush": true, 00:19:18.382 "reset": true, 00:19:18.382 "nvme_admin": false, 00:19:18.382 "nvme_io": false, 00:19:18.382 "nvme_io_md": false, 00:19:18.382 "write_zeroes": true, 00:19:18.382 "zcopy": true, 00:19:18.382 "get_zone_info": false, 00:19:18.382 "zone_management": false, 00:19:18.382 "zone_append": false, 00:19:18.382 "compare": false, 00:19:18.382 "compare_and_write": false, 00:19:18.382 "abort": true, 00:19:18.382 "seek_hole": false, 00:19:18.382 "seek_data": false, 00:19:18.382 "copy": true, 00:19:18.382 "nvme_iov_md": false 00:19:18.382 }, 00:19:18.382 "memory_domains": [ 00:19:18.382 { 00:19:18.382 "dma_device_id": "system", 00:19:18.382 "dma_device_type": 1 00:19:18.382 }, 00:19:18.382 { 00:19:18.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.382 "dma_device_type": 2 00:19:18.382 } 00:19:18.382 ], 00:19:18.382 "driver_specific": {} 00:19:18.382 } 00:19:18.382 ] 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.382 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.641 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.641 "name": "Existed_Raid", 00:19:18.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.641 "strip_size_kb": 64, 00:19:18.641 "state": "configuring", 00:19:18.641 "raid_level": "concat", 00:19:18.641 "superblock": false, 00:19:18.641 "num_base_bdevs": 4, 00:19:18.641 
"num_base_bdevs_discovered": 2, 00:19:18.641 "num_base_bdevs_operational": 4, 00:19:18.641 "base_bdevs_list": [ 00:19:18.641 { 00:19:18.641 "name": "BaseBdev1", 00:19:18.641 "uuid": "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322", 00:19:18.641 "is_configured": true, 00:19:18.641 "data_offset": 0, 00:19:18.641 "data_size": 65536 00:19:18.641 }, 00:19:18.641 { 00:19:18.641 "name": "BaseBdev2", 00:19:18.641 "uuid": "6be219e7-3b48-47d1-a9fe-130c5ba34813", 00:19:18.641 "is_configured": true, 00:19:18.641 "data_offset": 0, 00:19:18.641 "data_size": 65536 00:19:18.641 }, 00:19:18.641 { 00:19:18.641 "name": "BaseBdev3", 00:19:18.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.641 "is_configured": false, 00:19:18.641 "data_offset": 0, 00:19:18.641 "data_size": 0 00:19:18.641 }, 00:19:18.641 { 00:19:18.641 "name": "BaseBdev4", 00:19:18.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:18.641 "is_configured": false, 00:19:18.641 "data_offset": 0, 00:19:18.641 "data_size": 0 00:19:18.641 } 00:19:18.641 ] 00:19:18.641 }' 00:19:18.641 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.641 18:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.209 18:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:19.467 [2024-07-12 18:23:03.085110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:19.467 BaseBdev3 00:19:19.467 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:19.467 18:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:19.467 18:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:19.468 18:23:03 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:19.468 18:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:19.468 18:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:19.468 18:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:19.726 18:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:19.985 [ 00:19:19.985 { 00:19:19.985 "name": "BaseBdev3", 00:19:19.985 "aliases": [ 00:19:19.985 "34d7bc77-db92-4ce1-9410-41ceb7c825d3" 00:19:19.985 ], 00:19:19.985 "product_name": "Malloc disk", 00:19:19.985 "block_size": 512, 00:19:19.985 "num_blocks": 65536, 00:19:19.985 "uuid": "34d7bc77-db92-4ce1-9410-41ceb7c825d3", 00:19:19.985 "assigned_rate_limits": { 00:19:19.985 "rw_ios_per_sec": 0, 00:19:19.985 "rw_mbytes_per_sec": 0, 00:19:19.985 "r_mbytes_per_sec": 0, 00:19:19.985 "w_mbytes_per_sec": 0 00:19:19.985 }, 00:19:19.985 "claimed": true, 00:19:19.985 "claim_type": "exclusive_write", 00:19:19.985 "zoned": false, 00:19:19.985 "supported_io_types": { 00:19:19.985 "read": true, 00:19:19.985 "write": true, 00:19:19.985 "unmap": true, 00:19:19.985 "flush": true, 00:19:19.985 "reset": true, 00:19:19.985 "nvme_admin": false, 00:19:19.985 "nvme_io": false, 00:19:19.985 "nvme_io_md": false, 00:19:19.985 "write_zeroes": true, 00:19:19.985 "zcopy": true, 00:19:19.985 "get_zone_info": false, 00:19:19.985 "zone_management": false, 00:19:19.985 "zone_append": false, 00:19:19.985 "compare": false, 00:19:19.985 "compare_and_write": false, 00:19:19.985 "abort": true, 00:19:19.985 "seek_hole": false, 00:19:19.985 "seek_data": false, 00:19:19.985 "copy": 
true, 00:19:19.985 "nvme_iov_md": false 00:19:19.985 }, 00:19:19.985 "memory_domains": [ 00:19:19.985 { 00:19:19.985 "dma_device_id": "system", 00:19:19.985 "dma_device_type": 1 00:19:19.985 }, 00:19:19.985 { 00:19:19.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.985 "dma_device_type": 2 00:19:19.985 } 00:19:19.985 ], 00:19:19.985 "driver_specific": {} 00:19:19.985 } 00:19:19.985 ] 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.985 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.243 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.243 "name": "Existed_Raid", 00:19:20.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.243 "strip_size_kb": 64, 00:19:20.243 "state": "configuring", 00:19:20.243 "raid_level": "concat", 00:19:20.243 "superblock": false, 00:19:20.243 "num_base_bdevs": 4, 00:19:20.243 "num_base_bdevs_discovered": 3, 00:19:20.243 "num_base_bdevs_operational": 4, 00:19:20.243 "base_bdevs_list": [ 00:19:20.243 { 00:19:20.243 "name": "BaseBdev1", 00:19:20.243 "uuid": "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322", 00:19:20.243 "is_configured": true, 00:19:20.243 "data_offset": 0, 00:19:20.243 "data_size": 65536 00:19:20.243 }, 00:19:20.243 { 00:19:20.243 "name": "BaseBdev2", 00:19:20.243 "uuid": "6be219e7-3b48-47d1-a9fe-130c5ba34813", 00:19:20.243 "is_configured": true, 00:19:20.243 "data_offset": 0, 00:19:20.243 "data_size": 65536 00:19:20.243 }, 00:19:20.243 { 00:19:20.243 "name": "BaseBdev3", 00:19:20.243 "uuid": "34d7bc77-db92-4ce1-9410-41ceb7c825d3", 00:19:20.243 "is_configured": true, 00:19:20.243 "data_offset": 0, 00:19:20.243 "data_size": 65536 00:19:20.243 }, 00:19:20.243 { 00:19:20.243 "name": "BaseBdev4", 00:19:20.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.243 "is_configured": false, 00:19:20.243 "data_offset": 0, 00:19:20.243 "data_size": 0 00:19:20.243 } 00:19:20.243 ] 00:19:20.243 }' 00:19:20.243 18:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.243 18:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:20.808 18:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:21.067 [2024-07-12 18:23:04.676755] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:21.067 [2024-07-12 18:23:04.676794] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcd0350 00:19:21.067 [2024-07-12 18:23:04.676803] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:21.067 [2024-07-12 18:23:04.677060] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcd0020 00:19:21.067 [2024-07-12 18:23:04.677185] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcd0350 00:19:21.067 [2024-07-12 18:23:04.677195] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcd0350 00:19:21.067 [2024-07-12 18:23:04.677360] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:21.067 BaseBdev4 00:19:21.067 18:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:21.067 18:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:21.067 18:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:21.067 18:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:21.067 18:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:21.067 18:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:21.067 18:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:21.326 18:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:21.585 [ 00:19:21.585 { 00:19:21.585 "name": "BaseBdev4", 00:19:21.585 "aliases": [ 00:19:21.585 "4c5a9f0a-9f57-473a-8355-1c114470876a" 00:19:21.585 ], 00:19:21.585 "product_name": "Malloc disk", 00:19:21.585 "block_size": 512, 00:19:21.585 "num_blocks": 65536, 00:19:21.585 "uuid": "4c5a9f0a-9f57-473a-8355-1c114470876a", 00:19:21.585 "assigned_rate_limits": { 00:19:21.585 "rw_ios_per_sec": 0, 00:19:21.585 "rw_mbytes_per_sec": 0, 00:19:21.585 "r_mbytes_per_sec": 0, 00:19:21.585 "w_mbytes_per_sec": 0 00:19:21.585 }, 00:19:21.585 "claimed": true, 00:19:21.585 "claim_type": "exclusive_write", 00:19:21.585 "zoned": false, 00:19:21.585 "supported_io_types": { 00:19:21.585 "read": true, 00:19:21.585 "write": true, 00:19:21.585 "unmap": true, 00:19:21.585 "flush": true, 00:19:21.585 "reset": true, 00:19:21.585 "nvme_admin": false, 00:19:21.585 "nvme_io": false, 00:19:21.585 "nvme_io_md": false, 00:19:21.585 "write_zeroes": true, 00:19:21.585 "zcopy": true, 00:19:21.585 "get_zone_info": false, 00:19:21.585 "zone_management": false, 00:19:21.585 "zone_append": false, 00:19:21.585 "compare": false, 00:19:21.585 "compare_and_write": false, 00:19:21.585 "abort": true, 00:19:21.585 "seek_hole": false, 00:19:21.585 "seek_data": false, 00:19:21.585 "copy": true, 00:19:21.585 "nvme_iov_md": false 00:19:21.585 }, 00:19:21.585 "memory_domains": [ 00:19:21.585 { 00:19:21.585 "dma_device_id": "system", 00:19:21.585 "dma_device_type": 1 00:19:21.585 }, 00:19:21.585 { 00:19:21.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.585 "dma_device_type": 2 00:19:21.585 } 00:19:21.585 ], 00:19:21.585 "driver_specific": {} 00:19:21.585 } 00:19:21.585 ] 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.585 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.586 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.586 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:21.844 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.844 "name": "Existed_Raid", 00:19:21.844 "uuid": "15ac1efa-8939-43a0-a259-1cde52d14e67", 00:19:21.844 "strip_size_kb": 64, 00:19:21.844 "state": "online", 00:19:21.844 "raid_level": "concat", 00:19:21.844 "superblock": false, 00:19:21.844 "num_base_bdevs": 4, 00:19:21.844 
"num_base_bdevs_discovered": 4, 00:19:21.845 "num_base_bdevs_operational": 4, 00:19:21.845 "base_bdevs_list": [ 00:19:21.845 { 00:19:21.845 "name": "BaseBdev1", 00:19:21.845 "uuid": "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322", 00:19:21.845 "is_configured": true, 00:19:21.845 "data_offset": 0, 00:19:21.845 "data_size": 65536 00:19:21.845 }, 00:19:21.845 { 00:19:21.845 "name": "BaseBdev2", 00:19:21.845 "uuid": "6be219e7-3b48-47d1-a9fe-130c5ba34813", 00:19:21.845 "is_configured": true, 00:19:21.845 "data_offset": 0, 00:19:21.845 "data_size": 65536 00:19:21.845 }, 00:19:21.845 { 00:19:21.845 "name": "BaseBdev3", 00:19:21.845 "uuid": "34d7bc77-db92-4ce1-9410-41ceb7c825d3", 00:19:21.845 "is_configured": true, 00:19:21.845 "data_offset": 0, 00:19:21.845 "data_size": 65536 00:19:21.845 }, 00:19:21.845 { 00:19:21.845 "name": "BaseBdev4", 00:19:21.845 "uuid": "4c5a9f0a-9f57-473a-8355-1c114470876a", 00:19:21.845 "is_configured": true, 00:19:21.845 "data_offset": 0, 00:19:21.845 "data_size": 65536 00:19:21.845 } 00:19:21.845 ] 00:19:21.845 }' 00:19:21.845 18:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.845 18:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:22.412 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:22.412 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:22.412 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:22.412 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:22.412 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:22.412 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:22.412 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:22.412 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:22.671 [2024-07-12 18:23:06.277345] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:22.671 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:22.671 "name": "Existed_Raid", 00:19:22.671 "aliases": [ 00:19:22.671 "15ac1efa-8939-43a0-a259-1cde52d14e67" 00:19:22.671 ], 00:19:22.671 "product_name": "Raid Volume", 00:19:22.671 "block_size": 512, 00:19:22.671 "num_blocks": 262144, 00:19:22.671 "uuid": "15ac1efa-8939-43a0-a259-1cde52d14e67", 00:19:22.671 "assigned_rate_limits": { 00:19:22.671 "rw_ios_per_sec": 0, 00:19:22.671 "rw_mbytes_per_sec": 0, 00:19:22.671 "r_mbytes_per_sec": 0, 00:19:22.671 "w_mbytes_per_sec": 0 00:19:22.671 }, 00:19:22.671 "claimed": false, 00:19:22.671 "zoned": false, 00:19:22.671 "supported_io_types": { 00:19:22.671 "read": true, 00:19:22.671 "write": true, 00:19:22.671 "unmap": true, 00:19:22.671 "flush": true, 00:19:22.671 "reset": true, 00:19:22.671 "nvme_admin": false, 00:19:22.671 "nvme_io": false, 00:19:22.671 "nvme_io_md": false, 00:19:22.671 "write_zeroes": true, 00:19:22.671 "zcopy": false, 00:19:22.671 "get_zone_info": false, 00:19:22.671 "zone_management": false, 00:19:22.671 "zone_append": false, 00:19:22.671 "compare": false, 00:19:22.671 "compare_and_write": false, 00:19:22.671 "abort": false, 00:19:22.671 "seek_hole": false, 00:19:22.671 "seek_data": false, 00:19:22.671 "copy": false, 00:19:22.671 "nvme_iov_md": false 00:19:22.671 }, 00:19:22.671 "memory_domains": [ 00:19:22.671 { 00:19:22.671 "dma_device_id": "system", 00:19:22.671 "dma_device_type": 1 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.671 "dma_device_type": 2 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 
"dma_device_id": "system", 00:19:22.671 "dma_device_type": 1 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.671 "dma_device_type": 2 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 "dma_device_id": "system", 00:19:22.671 "dma_device_type": 1 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.671 "dma_device_type": 2 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 "dma_device_id": "system", 00:19:22.671 "dma_device_type": 1 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.671 "dma_device_type": 2 00:19:22.671 } 00:19:22.671 ], 00:19:22.671 "driver_specific": { 00:19:22.671 "raid": { 00:19:22.671 "uuid": "15ac1efa-8939-43a0-a259-1cde52d14e67", 00:19:22.671 "strip_size_kb": 64, 00:19:22.671 "state": "online", 00:19:22.671 "raid_level": "concat", 00:19:22.671 "superblock": false, 00:19:22.671 "num_base_bdevs": 4, 00:19:22.671 "num_base_bdevs_discovered": 4, 00:19:22.671 "num_base_bdevs_operational": 4, 00:19:22.671 "base_bdevs_list": [ 00:19:22.671 { 00:19:22.671 "name": "BaseBdev1", 00:19:22.671 "uuid": "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322", 00:19:22.671 "is_configured": true, 00:19:22.671 "data_offset": 0, 00:19:22.671 "data_size": 65536 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 "name": "BaseBdev2", 00:19:22.671 "uuid": "6be219e7-3b48-47d1-a9fe-130c5ba34813", 00:19:22.671 "is_configured": true, 00:19:22.671 "data_offset": 0, 00:19:22.671 "data_size": 65536 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 "name": "BaseBdev3", 00:19:22.671 "uuid": "34d7bc77-db92-4ce1-9410-41ceb7c825d3", 00:19:22.671 "is_configured": true, 00:19:22.671 "data_offset": 0, 00:19:22.671 "data_size": 65536 00:19:22.671 }, 00:19:22.671 { 00:19:22.671 "name": "BaseBdev4", 00:19:22.671 "uuid": "4c5a9f0a-9f57-473a-8355-1c114470876a", 00:19:22.671 "is_configured": true, 00:19:22.671 "data_offset": 0, 00:19:22.671 "data_size": 65536 00:19:22.671 } 00:19:22.671 ] 
00:19:22.671 } 00:19:22.671 } 00:19:22.671 }' 00:19:22.671 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:22.671 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:22.671 BaseBdev2 00:19:22.671 BaseBdev3 00:19:22.671 BaseBdev4' 00:19:22.671 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:22.671 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:22.671 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:22.930 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:22.930 "name": "BaseBdev1", 00:19:22.930 "aliases": [ 00:19:22.930 "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322" 00:19:22.930 ], 00:19:22.930 "product_name": "Malloc disk", 00:19:22.930 "block_size": 512, 00:19:22.930 "num_blocks": 65536, 00:19:22.930 "uuid": "4d681a8a-fe7f-4d3a-a496-a9a16e5e4322", 00:19:22.930 "assigned_rate_limits": { 00:19:22.930 "rw_ios_per_sec": 0, 00:19:22.930 "rw_mbytes_per_sec": 0, 00:19:22.930 "r_mbytes_per_sec": 0, 00:19:22.930 "w_mbytes_per_sec": 0 00:19:22.930 }, 00:19:22.930 "claimed": true, 00:19:22.930 "claim_type": "exclusive_write", 00:19:22.930 "zoned": false, 00:19:22.930 "supported_io_types": { 00:19:22.930 "read": true, 00:19:22.931 "write": true, 00:19:22.931 "unmap": true, 00:19:22.931 "flush": true, 00:19:22.931 "reset": true, 00:19:22.931 "nvme_admin": false, 00:19:22.931 "nvme_io": false, 00:19:22.931 "nvme_io_md": false, 00:19:22.931 "write_zeroes": true, 00:19:22.931 "zcopy": true, 00:19:22.931 "get_zone_info": false, 00:19:22.931 "zone_management": false, 00:19:22.931 "zone_append": false, 00:19:22.931 "compare": 
false, 00:19:22.931 "compare_and_write": false, 00:19:22.931 "abort": true, 00:19:22.931 "seek_hole": false, 00:19:22.931 "seek_data": false, 00:19:22.931 "copy": true, 00:19:22.931 "nvme_iov_md": false 00:19:22.931 }, 00:19:22.931 "memory_domains": [ 00:19:22.931 { 00:19:22.931 "dma_device_id": "system", 00:19:22.931 "dma_device_type": 1 00:19:22.931 }, 00:19:22.931 { 00:19:22.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.931 "dma_device_type": 2 00:19:22.931 } 00:19:22.931 ], 00:19:22.931 "driver_specific": {} 00:19:22.931 }' 00:19:22.931 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.931 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.189 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:23.189 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.189 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.189 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:23.189 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.189 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.189 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:23.189 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.189 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.448 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:23.448 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:23.448 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:23.448 18:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:23.707 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:23.707 "name": "BaseBdev2", 00:19:23.707 "aliases": [ 00:19:23.707 "6be219e7-3b48-47d1-a9fe-130c5ba34813" 00:19:23.707 ], 00:19:23.707 "product_name": "Malloc disk", 00:19:23.707 "block_size": 512, 00:19:23.707 "num_blocks": 65536, 00:19:23.707 "uuid": "6be219e7-3b48-47d1-a9fe-130c5ba34813", 00:19:23.707 "assigned_rate_limits": { 00:19:23.707 "rw_ios_per_sec": 0, 00:19:23.707 "rw_mbytes_per_sec": 0, 00:19:23.707 "r_mbytes_per_sec": 0, 00:19:23.707 "w_mbytes_per_sec": 0 00:19:23.707 }, 00:19:23.707 "claimed": true, 00:19:23.707 "claim_type": "exclusive_write", 00:19:23.707 "zoned": false, 00:19:23.707 "supported_io_types": { 00:19:23.707 "read": true, 00:19:23.707 "write": true, 00:19:23.707 "unmap": true, 00:19:23.707 "flush": true, 00:19:23.707 "reset": true, 00:19:23.707 "nvme_admin": false, 00:19:23.707 "nvme_io": false, 00:19:23.707 "nvme_io_md": false, 00:19:23.707 "write_zeroes": true, 00:19:23.707 "zcopy": true, 00:19:23.707 "get_zone_info": false, 00:19:23.707 "zone_management": false, 00:19:23.707 "zone_append": false, 00:19:23.707 "compare": false, 00:19:23.707 "compare_and_write": false, 00:19:23.707 "abort": true, 00:19:23.707 "seek_hole": false, 00:19:23.707 "seek_data": false, 00:19:23.707 "copy": true, 00:19:23.707 "nvme_iov_md": false 00:19:23.707 }, 00:19:23.707 "memory_domains": [ 00:19:23.707 { 00:19:23.707 "dma_device_id": "system", 00:19:23.707 "dma_device_type": 1 00:19:23.707 }, 00:19:23.707 { 00:19:23.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.707 "dma_device_type": 2 00:19:23.707 } 00:19:23.707 ], 00:19:23.707 "driver_specific": {} 00:19:23.707 }' 00:19:23.707 18:23:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.707 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.707 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:23.707 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.707 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.707 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:23.707 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.707 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.966 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:23.966 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.966 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.966 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:23.966 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:23.966 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:23.966 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.224 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:24.224 "name": "BaseBdev3", 00:19:24.224 "aliases": [ 00:19:24.224 "34d7bc77-db92-4ce1-9410-41ceb7c825d3" 00:19:24.224 ], 00:19:24.224 "product_name": "Malloc disk", 00:19:24.224 "block_size": 512, 00:19:24.224 "num_blocks": 65536, 00:19:24.224 "uuid": "34d7bc77-db92-4ce1-9410-41ceb7c825d3", 
00:19:24.224 "assigned_rate_limits": { 00:19:24.224 "rw_ios_per_sec": 0, 00:19:24.224 "rw_mbytes_per_sec": 0, 00:19:24.224 "r_mbytes_per_sec": 0, 00:19:24.224 "w_mbytes_per_sec": 0 00:19:24.224 }, 00:19:24.224 "claimed": true, 00:19:24.224 "claim_type": "exclusive_write", 00:19:24.224 "zoned": false, 00:19:24.224 "supported_io_types": { 00:19:24.224 "read": true, 00:19:24.224 "write": true, 00:19:24.224 "unmap": true, 00:19:24.224 "flush": true, 00:19:24.224 "reset": true, 00:19:24.224 "nvme_admin": false, 00:19:24.224 "nvme_io": false, 00:19:24.224 "nvme_io_md": false, 00:19:24.224 "write_zeroes": true, 00:19:24.224 "zcopy": true, 00:19:24.224 "get_zone_info": false, 00:19:24.224 "zone_management": false, 00:19:24.224 "zone_append": false, 00:19:24.224 "compare": false, 00:19:24.224 "compare_and_write": false, 00:19:24.224 "abort": true, 00:19:24.224 "seek_hole": false, 00:19:24.224 "seek_data": false, 00:19:24.224 "copy": true, 00:19:24.224 "nvme_iov_md": false 00:19:24.224 }, 00:19:24.224 "memory_domains": [ 00:19:24.224 { 00:19:24.224 "dma_device_id": "system", 00:19:24.225 "dma_device_type": 1 00:19:24.225 }, 00:19:24.225 { 00:19:24.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.225 "dma_device_type": 2 00:19:24.225 } 00:19:24.225 ], 00:19:24.225 "driver_specific": {} 00:19:24.225 }' 00:19:24.225 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.225 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.225 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:24.225 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.225 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.483 18:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:24.483 18:23:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.483 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.483 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.483 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.483 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.483 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.483 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:24.483 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:24.483 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.743 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:24.743 "name": "BaseBdev4", 00:19:24.743 "aliases": [ 00:19:24.743 "4c5a9f0a-9f57-473a-8355-1c114470876a" 00:19:24.743 ], 00:19:24.743 "product_name": "Malloc disk", 00:19:24.743 "block_size": 512, 00:19:24.743 "num_blocks": 65536, 00:19:24.743 "uuid": "4c5a9f0a-9f57-473a-8355-1c114470876a", 00:19:24.743 "assigned_rate_limits": { 00:19:24.743 "rw_ios_per_sec": 0, 00:19:24.743 "rw_mbytes_per_sec": 0, 00:19:24.743 "r_mbytes_per_sec": 0, 00:19:24.743 "w_mbytes_per_sec": 0 00:19:24.743 }, 00:19:24.743 "claimed": true, 00:19:24.743 "claim_type": "exclusive_write", 00:19:24.743 "zoned": false, 00:19:24.743 "supported_io_types": { 00:19:24.743 "read": true, 00:19:24.743 "write": true, 00:19:24.743 "unmap": true, 00:19:24.743 "flush": true, 00:19:24.743 "reset": true, 00:19:24.743 "nvme_admin": false, 00:19:24.743 "nvme_io": false, 00:19:24.743 "nvme_io_md": false, 00:19:24.743 "write_zeroes": true, 
00:19:24.743 "zcopy": true, 00:19:24.743 "get_zone_info": false, 00:19:24.743 "zone_management": false, 00:19:24.743 "zone_append": false, 00:19:24.743 "compare": false, 00:19:24.743 "compare_and_write": false, 00:19:24.743 "abort": true, 00:19:24.743 "seek_hole": false, 00:19:24.743 "seek_data": false, 00:19:24.743 "copy": true, 00:19:24.743 "nvme_iov_md": false 00:19:24.743 }, 00:19:24.743 "memory_domains": [ 00:19:24.743 { 00:19:24.743 "dma_device_id": "system", 00:19:24.743 "dma_device_type": 1 00:19:24.743 }, 00:19:24.743 { 00:19:24.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.743 "dma_device_type": 2 00:19:24.743 } 00:19:24.743 ], 00:19:24.743 "driver_specific": {} 00:19:24.743 }' 00:19:24.743 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.743 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.002 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:25.002 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.002 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.002 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:25.002 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.002 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.002 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.002 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.002 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:25.261 [2024-07-12 18:23:08.956160] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:25.261 [2024-07-12 18:23:08.956186] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:25.261 [2024-07-12 18:23:08.956231] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:25.261 18:23:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.261 18:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.520 18:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.520 "name": "Existed_Raid", 00:19:25.520 "uuid": "15ac1efa-8939-43a0-a259-1cde52d14e67", 00:19:25.520 "strip_size_kb": 64, 00:19:25.520 "state": "offline", 00:19:25.520 "raid_level": "concat", 00:19:25.520 "superblock": false, 00:19:25.520 "num_base_bdevs": 4, 00:19:25.520 "num_base_bdevs_discovered": 3, 00:19:25.520 "num_base_bdevs_operational": 3, 00:19:25.520 "base_bdevs_list": [ 00:19:25.520 { 00:19:25.520 "name": null, 00:19:25.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.520 "is_configured": false, 00:19:25.520 "data_offset": 0, 00:19:25.520 "data_size": 65536 00:19:25.520 }, 00:19:25.520 { 00:19:25.520 "name": "BaseBdev2", 00:19:25.520 "uuid": "6be219e7-3b48-47d1-a9fe-130c5ba34813", 00:19:25.520 "is_configured": true, 00:19:25.520 "data_offset": 0, 00:19:25.520 "data_size": 65536 00:19:25.520 }, 00:19:25.520 { 00:19:25.520 "name": "BaseBdev3", 00:19:25.520 "uuid": "34d7bc77-db92-4ce1-9410-41ceb7c825d3", 00:19:25.520 "is_configured": true, 00:19:25.520 "data_offset": 0, 00:19:25.520 "data_size": 65536 00:19:25.520 }, 00:19:25.520 { 00:19:25.520 "name": "BaseBdev4", 00:19:25.520 "uuid": "4c5a9f0a-9f57-473a-8355-1c114470876a", 00:19:25.520 "is_configured": true, 00:19:25.520 "data_offset": 0, 00:19:25.520 "data_size": 65536 00:19:25.520 } 00:19:25.520 ] 00:19:25.520 }' 00:19:25.520 18:23:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.520 18:23:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:26.457 18:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:26.457 18:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:26.457 18:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:26.457 18:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.457 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:26.457 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:26.457 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:26.716 [2024-07-12 18:23:10.288736] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:26.716 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:26.716 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:26.716 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.716 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:26.974 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:26.974 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid 
']' 00:19:26.974 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:27.233 [2024-07-12 18:23:10.786414] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:27.233 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:27.233 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:27.233 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.233 18:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:27.492 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:27.492 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:27.492 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:27.751 [2024-07-12 18:23:11.286320] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:27.751 [2024-07-12 18:23:11.286365] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd0350 name Existed_Raid, state offline 00:19:27.751 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:27.751 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:27.751 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.751 18:23:11 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:28.010 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:28.010 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:28.010 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:28.010 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:28.010 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:28.010 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:28.269 BaseBdev2 00:19:28.269 18:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:28.269 18:23:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:28.269 18:23:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:28.269 18:23:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:28.269 18:23:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:28.269 18:23:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:28.269 18:23:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:28.527 18:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:28.787 [ 00:19:28.788 { 00:19:28.788 "name": "BaseBdev2", 00:19:28.788 "aliases": 
[ 00:19:28.788 "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70" 00:19:28.788 ], 00:19:28.788 "product_name": "Malloc disk", 00:19:28.788 "block_size": 512, 00:19:28.788 "num_blocks": 65536, 00:19:28.788 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:28.788 "assigned_rate_limits": { 00:19:28.788 "rw_ios_per_sec": 0, 00:19:28.788 "rw_mbytes_per_sec": 0, 00:19:28.788 "r_mbytes_per_sec": 0, 00:19:28.788 "w_mbytes_per_sec": 0 00:19:28.788 }, 00:19:28.788 "claimed": false, 00:19:28.788 "zoned": false, 00:19:28.788 "supported_io_types": { 00:19:28.788 "read": true, 00:19:28.788 "write": true, 00:19:28.788 "unmap": true, 00:19:28.788 "flush": true, 00:19:28.788 "reset": true, 00:19:28.788 "nvme_admin": false, 00:19:28.788 "nvme_io": false, 00:19:28.788 "nvme_io_md": false, 00:19:28.788 "write_zeroes": true, 00:19:28.788 "zcopy": true, 00:19:28.788 "get_zone_info": false, 00:19:28.788 "zone_management": false, 00:19:28.788 "zone_append": false, 00:19:28.788 "compare": false, 00:19:28.788 "compare_and_write": false, 00:19:28.788 "abort": true, 00:19:28.788 "seek_hole": false, 00:19:28.788 "seek_data": false, 00:19:28.788 "copy": true, 00:19:28.788 "nvme_iov_md": false 00:19:28.788 }, 00:19:28.788 "memory_domains": [ 00:19:28.788 { 00:19:28.788 "dma_device_id": "system", 00:19:28.788 "dma_device_type": 1 00:19:28.788 }, 00:19:28.788 { 00:19:28.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.788 "dma_device_type": 2 00:19:28.788 } 00:19:28.788 ], 00:19:28.788 "driver_specific": {} 00:19:28.788 } 00:19:28.788 ] 00:19:28.788 18:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:28.788 18:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:28.788 18:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:28.788 18:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:29.048 BaseBdev3 00:19:29.048 18:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:29.048 18:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:29.048 18:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:29.048 18:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:29.048 18:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:29.048 18:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:29.048 18:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:29.330 18:23:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:29.330 [ 00:19:29.330 { 00:19:29.330 "name": "BaseBdev3", 00:19:29.330 "aliases": [ 00:19:29.330 "e35e6e6f-e586-44dc-a04c-be18f779707f" 00:19:29.330 ], 00:19:29.330 "product_name": "Malloc disk", 00:19:29.330 "block_size": 512, 00:19:29.330 "num_blocks": 65536, 00:19:29.330 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:29.330 "assigned_rate_limits": { 00:19:29.330 "rw_ios_per_sec": 0, 00:19:29.330 "rw_mbytes_per_sec": 0, 00:19:29.330 "r_mbytes_per_sec": 0, 00:19:29.330 "w_mbytes_per_sec": 0 00:19:29.330 }, 00:19:29.330 "claimed": false, 00:19:29.330 "zoned": false, 00:19:29.330 "supported_io_types": { 00:19:29.330 "read": true, 00:19:29.330 "write": true, 00:19:29.330 "unmap": true, 00:19:29.330 "flush": true, 00:19:29.330 "reset": true, 00:19:29.330 "nvme_admin": false, 00:19:29.330 
"nvme_io": false, 00:19:29.330 "nvme_io_md": false, 00:19:29.330 "write_zeroes": true, 00:19:29.330 "zcopy": true, 00:19:29.330 "get_zone_info": false, 00:19:29.330 "zone_management": false, 00:19:29.330 "zone_append": false, 00:19:29.330 "compare": false, 00:19:29.330 "compare_and_write": false, 00:19:29.330 "abort": true, 00:19:29.330 "seek_hole": false, 00:19:29.330 "seek_data": false, 00:19:29.330 "copy": true, 00:19:29.330 "nvme_iov_md": false 00:19:29.330 }, 00:19:29.330 "memory_domains": [ 00:19:29.330 { 00:19:29.330 "dma_device_id": "system", 00:19:29.330 "dma_device_type": 1 00:19:29.330 }, 00:19:29.330 { 00:19:29.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.330 "dma_device_type": 2 00:19:29.330 } 00:19:29.330 ], 00:19:29.330 "driver_specific": {} 00:19:29.330 } 00:19:29.330 ] 00:19:29.330 18:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:29.330 18:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:29.330 18:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:29.330 18:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:29.589 BaseBdev4 00:19:29.589 18:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:29.590 18:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:29.590 18:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:29.590 18:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:29.590 18:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:29.590 18:23:13 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:29.590 18:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:29.848 18:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:30.106 [ 00:19:30.106 { 00:19:30.106 "name": "BaseBdev4", 00:19:30.106 "aliases": [ 00:19:30.106 "d5d89953-f807-4676-9ece-9b6628ee6024" 00:19:30.106 ], 00:19:30.106 "product_name": "Malloc disk", 00:19:30.106 "block_size": 512, 00:19:30.106 "num_blocks": 65536, 00:19:30.106 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:30.106 "assigned_rate_limits": { 00:19:30.106 "rw_ios_per_sec": 0, 00:19:30.106 "rw_mbytes_per_sec": 0, 00:19:30.106 "r_mbytes_per_sec": 0, 00:19:30.106 "w_mbytes_per_sec": 0 00:19:30.106 }, 00:19:30.106 "claimed": false, 00:19:30.106 "zoned": false, 00:19:30.106 "supported_io_types": { 00:19:30.106 "read": true, 00:19:30.106 "write": true, 00:19:30.106 "unmap": true, 00:19:30.106 "flush": true, 00:19:30.106 "reset": true, 00:19:30.106 "nvme_admin": false, 00:19:30.106 "nvme_io": false, 00:19:30.106 "nvme_io_md": false, 00:19:30.106 "write_zeroes": true, 00:19:30.106 "zcopy": true, 00:19:30.106 "get_zone_info": false, 00:19:30.106 "zone_management": false, 00:19:30.106 "zone_append": false, 00:19:30.106 "compare": false, 00:19:30.106 "compare_and_write": false, 00:19:30.106 "abort": true, 00:19:30.106 "seek_hole": false, 00:19:30.106 "seek_data": false, 00:19:30.106 "copy": true, 00:19:30.106 "nvme_iov_md": false 00:19:30.106 }, 00:19:30.106 "memory_domains": [ 00:19:30.106 { 00:19:30.106 "dma_device_id": "system", 00:19:30.106 "dma_device_type": 1 00:19:30.106 }, 00:19:30.106 { 00:19:30.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.106 "dma_device_type": 
2 00:19:30.106 } 00:19:30.106 ], 00:19:30.106 "driver_specific": {} 00:19:30.106 } 00:19:30.106 ] 00:19:30.106 18:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:30.106 18:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:30.106 18:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:30.106 18:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:30.365 [2024-07-12 18:23:13.983573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:30.365 [2024-07-12 18:23:13.983618] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:30.365 [2024-07-12 18:23:13.983636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:30.365 [2024-07-12 18:23:13.985000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:30.365 [2024-07-12 18:23:13.985044] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:30.365 18:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.365 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:30.631 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:30.631 "name": "Existed_Raid", 00:19:30.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:30.631 "strip_size_kb": 64, 00:19:30.631 "state": "configuring", 00:19:30.631 "raid_level": "concat", 00:19:30.631 "superblock": false, 00:19:30.631 "num_base_bdevs": 4, 00:19:30.631 "num_base_bdevs_discovered": 3, 00:19:30.631 "num_base_bdevs_operational": 4, 00:19:30.631 "base_bdevs_list": [ 00:19:30.631 { 00:19:30.631 "name": "BaseBdev1", 00:19:30.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:30.631 "is_configured": false, 00:19:30.631 "data_offset": 0, 00:19:30.631 "data_size": 0 00:19:30.631 }, 00:19:30.631 { 00:19:30.631 "name": "BaseBdev2", 00:19:30.631 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:30.631 "is_configured": true, 00:19:30.631 "data_offset": 0, 00:19:30.631 "data_size": 65536 00:19:30.631 }, 00:19:30.631 { 00:19:30.631 "name": "BaseBdev3", 00:19:30.631 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:30.631 "is_configured": true, 00:19:30.631 "data_offset": 0, 00:19:30.631 "data_size": 65536 
00:19:30.631 }, 00:19:30.631 { 00:19:30.631 "name": "BaseBdev4", 00:19:30.631 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:30.631 "is_configured": true, 00:19:30.631 "data_offset": 0, 00:19:30.631 "data_size": 65536 00:19:30.631 } 00:19:30.631 ] 00:19:30.631 }' 00:19:30.631 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:30.631 18:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:31.221 18:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:31.480 [2024-07-12 18:23:15.062407] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:31.480 18:23:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.480 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:31.739 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.739 "name": "Existed_Raid", 00:19:31.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.739 "strip_size_kb": 64, 00:19:31.739 "state": "configuring", 00:19:31.739 "raid_level": "concat", 00:19:31.739 "superblock": false, 00:19:31.739 "num_base_bdevs": 4, 00:19:31.739 "num_base_bdevs_discovered": 2, 00:19:31.739 "num_base_bdevs_operational": 4, 00:19:31.739 "base_bdevs_list": [ 00:19:31.739 { 00:19:31.739 "name": "BaseBdev1", 00:19:31.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.739 "is_configured": false, 00:19:31.739 "data_offset": 0, 00:19:31.739 "data_size": 0 00:19:31.739 }, 00:19:31.739 { 00:19:31.740 "name": null, 00:19:31.740 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:31.740 "is_configured": false, 00:19:31.740 "data_offset": 0, 00:19:31.740 "data_size": 65536 00:19:31.740 }, 00:19:31.740 { 00:19:31.740 "name": "BaseBdev3", 00:19:31.740 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:31.740 "is_configured": true, 00:19:31.740 "data_offset": 0, 00:19:31.740 "data_size": 65536 00:19:31.740 }, 00:19:31.740 { 00:19:31.740 "name": "BaseBdev4", 00:19:31.740 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:31.740 "is_configured": true, 00:19:31.740 "data_offset": 0, 00:19:31.740 "data_size": 65536 00:19:31.740 } 00:19:31.740 ] 00:19:31.740 }' 00:19:31.740 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:31.740 18:23:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:32.308 18:23:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.308 18:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:32.567 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:32.567 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:32.826 [2024-07-12 18:23:16.414595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:32.826 BaseBdev1 00:19:32.826 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:32.826 18:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:32.826 18:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:32.826 18:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:32.826 18:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:32.826 18:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:32.826 18:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:33.085 18:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:33.344 [ 00:19:33.344 { 00:19:33.344 "name": "BaseBdev1", 00:19:33.344 "aliases": [ 00:19:33.344 "53b0609a-0550-47a0-b407-5d9751ba7a80" 00:19:33.344 
], 00:19:33.344 "product_name": "Malloc disk", 00:19:33.344 "block_size": 512, 00:19:33.344 "num_blocks": 65536, 00:19:33.344 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:33.344 "assigned_rate_limits": { 00:19:33.344 "rw_ios_per_sec": 0, 00:19:33.344 "rw_mbytes_per_sec": 0, 00:19:33.344 "r_mbytes_per_sec": 0, 00:19:33.344 "w_mbytes_per_sec": 0 00:19:33.344 }, 00:19:33.344 "claimed": true, 00:19:33.344 "claim_type": "exclusive_write", 00:19:33.344 "zoned": false, 00:19:33.344 "supported_io_types": { 00:19:33.344 "read": true, 00:19:33.344 "write": true, 00:19:33.344 "unmap": true, 00:19:33.344 "flush": true, 00:19:33.344 "reset": true, 00:19:33.344 "nvme_admin": false, 00:19:33.344 "nvme_io": false, 00:19:33.344 "nvme_io_md": false, 00:19:33.344 "write_zeroes": true, 00:19:33.344 "zcopy": true, 00:19:33.344 "get_zone_info": false, 00:19:33.344 "zone_management": false, 00:19:33.344 "zone_append": false, 00:19:33.344 "compare": false, 00:19:33.344 "compare_and_write": false, 00:19:33.344 "abort": true, 00:19:33.344 "seek_hole": false, 00:19:33.344 "seek_data": false, 00:19:33.344 "copy": true, 00:19:33.344 "nvme_iov_md": false 00:19:33.344 }, 00:19:33.344 "memory_domains": [ 00:19:33.344 { 00:19:33.344 "dma_device_id": "system", 00:19:33.344 "dma_device_type": 1 00:19:33.344 }, 00:19:33.344 { 00:19:33.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.344 "dma_device_type": 2 00:19:33.344 } 00:19:33.344 ], 00:19:33.344 "driver_specific": {} 00:19:33.344 } 00:19:33.344 ] 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.344 18:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:33.603 18:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.603 "name": "Existed_Raid", 00:19:33.603 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.603 "strip_size_kb": 64, 00:19:33.603 "state": "configuring", 00:19:33.603 "raid_level": "concat", 00:19:33.603 "superblock": false, 00:19:33.603 "num_base_bdevs": 4, 00:19:33.604 "num_base_bdevs_discovered": 3, 00:19:33.604 "num_base_bdevs_operational": 4, 00:19:33.604 "base_bdevs_list": [ 00:19:33.604 { 00:19:33.604 "name": "BaseBdev1", 00:19:33.604 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:33.604 "is_configured": true, 00:19:33.604 "data_offset": 0, 00:19:33.604 "data_size": 65536 00:19:33.604 }, 00:19:33.604 { 00:19:33.604 "name": null, 00:19:33.604 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:33.604 "is_configured": false, 00:19:33.604 
"data_offset": 0, 00:19:33.604 "data_size": 65536 00:19:33.604 }, 00:19:33.604 { 00:19:33.604 "name": "BaseBdev3", 00:19:33.604 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:33.604 "is_configured": true, 00:19:33.604 "data_offset": 0, 00:19:33.604 "data_size": 65536 00:19:33.604 }, 00:19:33.604 { 00:19:33.604 "name": "BaseBdev4", 00:19:33.604 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:33.604 "is_configured": true, 00:19:33.604 "data_offset": 0, 00:19:33.604 "data_size": 65536 00:19:33.604 } 00:19:33.604 ] 00:19:33.604 }' 00:19:33.604 18:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.604 18:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:34.171 18:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:34.171 18:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.430 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:34.430 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:34.689 [2024-07-12 18:23:18.239449] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.689 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:34.948 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.948 "name": "Existed_Raid", 00:19:34.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:34.948 "strip_size_kb": 64, 00:19:34.948 "state": "configuring", 00:19:34.948 "raid_level": "concat", 00:19:34.948 "superblock": false, 00:19:34.948 "num_base_bdevs": 4, 00:19:34.948 "num_base_bdevs_discovered": 2, 00:19:34.948 "num_base_bdevs_operational": 4, 00:19:34.948 "base_bdevs_list": [ 00:19:34.948 { 00:19:34.948 "name": "BaseBdev1", 00:19:34.948 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:34.948 "is_configured": true, 00:19:34.948 "data_offset": 0, 00:19:34.948 "data_size": 65536 00:19:34.948 }, 00:19:34.948 { 00:19:34.948 "name": null, 00:19:34.948 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:34.948 "is_configured": false, 00:19:34.948 "data_offset": 0, 00:19:34.948 "data_size": 65536 00:19:34.948 }, 00:19:34.948 { 00:19:34.948 "name": 
null, 00:19:34.948 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:34.948 "is_configured": false, 00:19:34.948 "data_offset": 0, 00:19:34.948 "data_size": 65536 00:19:34.948 }, 00:19:34.948 { 00:19:34.948 "name": "BaseBdev4", 00:19:34.948 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:34.948 "is_configured": true, 00:19:34.948 "data_offset": 0, 00:19:34.948 "data_size": 65536 00:19:34.948 } 00:19:34.948 ] 00:19:34.948 }' 00:19:34.948 18:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.948 18:23:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:35.515 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.515 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:35.774 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:35.774 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:36.342 [2024-07-12 18:23:19.823696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.342 18:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:36.600 18:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.600 "name": "Existed_Raid", 00:19:36.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.600 "strip_size_kb": 64, 00:19:36.600 "state": "configuring", 00:19:36.600 "raid_level": "concat", 00:19:36.600 "superblock": false, 00:19:36.600 "num_base_bdevs": 4, 00:19:36.600 "num_base_bdevs_discovered": 3, 00:19:36.601 "num_base_bdevs_operational": 4, 00:19:36.601 "base_bdevs_list": [ 00:19:36.601 { 00:19:36.601 "name": "BaseBdev1", 00:19:36.601 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:36.601 "is_configured": true, 00:19:36.601 "data_offset": 0, 00:19:36.601 "data_size": 65536 00:19:36.601 }, 00:19:36.601 { 00:19:36.601 "name": null, 00:19:36.601 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:36.601 "is_configured": false, 00:19:36.601 "data_offset": 0, 00:19:36.601 "data_size": 65536 00:19:36.601 }, 00:19:36.601 { 00:19:36.601 "name": "BaseBdev3", 00:19:36.601 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 
00:19:36.601 "is_configured": true, 00:19:36.601 "data_offset": 0, 00:19:36.601 "data_size": 65536 00:19:36.601 }, 00:19:36.601 { 00:19:36.601 "name": "BaseBdev4", 00:19:36.601 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:36.601 "is_configured": true, 00:19:36.601 "data_offset": 0, 00:19:36.601 "data_size": 65536 00:19:36.601 } 00:19:36.601 ] 00:19:36.601 }' 00:19:36.601 18:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.601 18:23:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.167 18:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.167 18:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:37.426 18:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:37.426 18:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:37.684 [2024-07-12 18:23:21.163265] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:37.684 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:37.684 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.685 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:37.943 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:37.943 "name": "Existed_Raid", 00:19:37.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:37.943 "strip_size_kb": 64, 00:19:37.943 "state": "configuring", 00:19:37.943 "raid_level": "concat", 00:19:37.943 "superblock": false, 00:19:37.943 "num_base_bdevs": 4, 00:19:37.943 "num_base_bdevs_discovered": 2, 00:19:37.943 "num_base_bdevs_operational": 4, 00:19:37.943 "base_bdevs_list": [ 00:19:37.943 { 00:19:37.943 "name": null, 00:19:37.943 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:37.943 "is_configured": false, 00:19:37.943 "data_offset": 0, 00:19:37.943 "data_size": 65536 00:19:37.943 }, 00:19:37.943 { 00:19:37.943 "name": null, 00:19:37.943 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:37.943 "is_configured": false, 00:19:37.943 "data_offset": 0, 00:19:37.943 "data_size": 65536 00:19:37.943 }, 00:19:37.943 { 00:19:37.943 "name": "BaseBdev3", 00:19:37.943 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:37.943 "is_configured": true, 00:19:37.943 "data_offset": 0, 00:19:37.943 "data_size": 65536 00:19:37.943 }, 
00:19:37.943 { 00:19:37.943 "name": "BaseBdev4", 00:19:37.943 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:37.943 "is_configured": true, 00:19:37.943 "data_offset": 0, 00:19:37.943 "data_size": 65536 00:19:37.943 } 00:19:37.943 ] 00:19:37.943 }' 00:19:37.943 18:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:37.943 18:23:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:38.510 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.510 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:38.510 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:38.510 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:38.768 [2024-07-12 18:23:22.447048] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:38.768 
18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:38.768 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.027 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.027 "name": "Existed_Raid", 00:19:39.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.027 "strip_size_kb": 64, 00:19:39.027 "state": "configuring", 00:19:39.027 "raid_level": "concat", 00:19:39.027 "superblock": false, 00:19:39.027 "num_base_bdevs": 4, 00:19:39.027 "num_base_bdevs_discovered": 3, 00:19:39.027 "num_base_bdevs_operational": 4, 00:19:39.027 "base_bdevs_list": [ 00:19:39.027 { 00:19:39.027 "name": null, 00:19:39.027 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:39.027 "is_configured": false, 00:19:39.027 "data_offset": 0, 00:19:39.027 "data_size": 65536 00:19:39.027 }, 00:19:39.027 { 00:19:39.027 "name": "BaseBdev2", 00:19:39.027 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:39.027 "is_configured": true, 00:19:39.027 "data_offset": 0, 00:19:39.027 "data_size": 65536 00:19:39.027 }, 00:19:39.027 { 00:19:39.027 "name": "BaseBdev3", 00:19:39.027 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:39.027 "is_configured": true, 00:19:39.027 "data_offset": 0, 00:19:39.027 "data_size": 65536 00:19:39.027 }, 00:19:39.027 { 00:19:39.027 "name": "BaseBdev4", 00:19:39.027 "uuid": 
"d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:39.027 "is_configured": true, 00:19:39.027 "data_offset": 0, 00:19:39.027 "data_size": 65536 00:19:39.027 } 00:19:39.027 ] 00:19:39.027 }' 00:19:39.027 18:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.027 18:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.606 18:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.606 18:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:39.864 18:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:39.864 18:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:39.864 18:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.122 18:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 53b0609a-0550-47a0-b407-5d9751ba7a80 00:19:40.380 [2024-07-12 18:23:23.926459] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:40.380 [2024-07-12 18:23:23.926497] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcd4040 00:19:40.380 [2024-07-12 18:23:23.926505] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:40.380 [2024-07-12 18:23:23.926700] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xccfa70 00:19:40.380 [2024-07-12 18:23:23.926816] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0xcd4040 00:19:40.380 [2024-07-12 18:23:23.926825] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcd4040 00:19:40.380 [2024-07-12 18:23:23.926999] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:40.380 NewBaseBdev 00:19:40.380 18:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:40.380 18:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:40.380 18:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:40.380 18:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:40.380 18:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:40.380 18:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:40.380 18:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:40.638 18:23:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:40.896 [ 00:19:40.896 { 00:19:40.896 "name": "NewBaseBdev", 00:19:40.896 "aliases": [ 00:19:40.896 "53b0609a-0550-47a0-b407-5d9751ba7a80" 00:19:40.896 ], 00:19:40.896 "product_name": "Malloc disk", 00:19:40.896 "block_size": 512, 00:19:40.896 "num_blocks": 65536, 00:19:40.896 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:40.896 "assigned_rate_limits": { 00:19:40.896 "rw_ios_per_sec": 0, 00:19:40.896 "rw_mbytes_per_sec": 0, 00:19:40.896 "r_mbytes_per_sec": 0, 00:19:40.896 "w_mbytes_per_sec": 0 00:19:40.896 }, 00:19:40.896 "claimed": true, 00:19:40.896 
"claim_type": "exclusive_write", 00:19:40.896 "zoned": false, 00:19:40.896 "supported_io_types": { 00:19:40.896 "read": true, 00:19:40.896 "write": true, 00:19:40.896 "unmap": true, 00:19:40.896 "flush": true, 00:19:40.896 "reset": true, 00:19:40.896 "nvme_admin": false, 00:19:40.896 "nvme_io": false, 00:19:40.896 "nvme_io_md": false, 00:19:40.896 "write_zeroes": true, 00:19:40.896 "zcopy": true, 00:19:40.896 "get_zone_info": false, 00:19:40.896 "zone_management": false, 00:19:40.896 "zone_append": false, 00:19:40.896 "compare": false, 00:19:40.896 "compare_and_write": false, 00:19:40.896 "abort": true, 00:19:40.896 "seek_hole": false, 00:19:40.896 "seek_data": false, 00:19:40.896 "copy": true, 00:19:40.896 "nvme_iov_md": false 00:19:40.896 }, 00:19:40.896 "memory_domains": [ 00:19:40.896 { 00:19:40.896 "dma_device_id": "system", 00:19:40.896 "dma_device_type": 1 00:19:40.896 }, 00:19:40.896 { 00:19:40.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.896 "dma_device_type": 2 00:19:40.896 } 00:19:40.896 ], 00:19:40.896 "driver_specific": {} 00:19:40.896 } 00:19:40.896 ] 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.896 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:41.155 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.155 "name": "Existed_Raid", 00:19:41.155 "uuid": "3a5fb464-33d1-4672-bec2-109297217fd3", 00:19:41.155 "strip_size_kb": 64, 00:19:41.155 "state": "online", 00:19:41.155 "raid_level": "concat", 00:19:41.155 "superblock": false, 00:19:41.155 "num_base_bdevs": 4, 00:19:41.155 "num_base_bdevs_discovered": 4, 00:19:41.155 "num_base_bdevs_operational": 4, 00:19:41.155 "base_bdevs_list": [ 00:19:41.155 { 00:19:41.155 "name": "NewBaseBdev", 00:19:41.155 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:41.155 "is_configured": true, 00:19:41.155 "data_offset": 0, 00:19:41.155 "data_size": 65536 00:19:41.155 }, 00:19:41.155 { 00:19:41.155 "name": "BaseBdev2", 00:19:41.155 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:41.155 "is_configured": true, 00:19:41.155 "data_offset": 0, 00:19:41.155 "data_size": 65536 00:19:41.155 }, 00:19:41.155 { 00:19:41.155 "name": "BaseBdev3", 00:19:41.155 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:41.155 "is_configured": true, 00:19:41.155 "data_offset": 0, 00:19:41.155 "data_size": 65536 00:19:41.155 }, 00:19:41.155 { 00:19:41.155 "name": "BaseBdev4", 00:19:41.155 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:41.155 "is_configured": 
true, 00:19:41.155 "data_offset": 0, 00:19:41.155 "data_size": 65536 00:19:41.155 } 00:19:41.155 ] 00:19:41.155 }' 00:19:41.155 18:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.155 18:23:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:41.722 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:41.722 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:41.722 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:41.722 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:41.722 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:41.722 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:41.722 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:41.722 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:41.981 [2024-07-12 18:23:25.531059] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:41.981 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:41.981 "name": "Existed_Raid", 00:19:41.981 "aliases": [ 00:19:41.981 "3a5fb464-33d1-4672-bec2-109297217fd3" 00:19:41.981 ], 00:19:41.981 "product_name": "Raid Volume", 00:19:41.981 "block_size": 512, 00:19:41.981 "num_blocks": 262144, 00:19:41.981 "uuid": "3a5fb464-33d1-4672-bec2-109297217fd3", 00:19:41.981 "assigned_rate_limits": { 00:19:41.981 "rw_ios_per_sec": 0, 00:19:41.981 "rw_mbytes_per_sec": 0, 00:19:41.981 "r_mbytes_per_sec": 0, 00:19:41.981 
"w_mbytes_per_sec": 0 00:19:41.981 }, 00:19:41.981 "claimed": false, 00:19:41.981 "zoned": false, 00:19:41.981 "supported_io_types": { 00:19:41.981 "read": true, 00:19:41.981 "write": true, 00:19:41.981 "unmap": true, 00:19:41.981 "flush": true, 00:19:41.981 "reset": true, 00:19:41.981 "nvme_admin": false, 00:19:41.981 "nvme_io": false, 00:19:41.981 "nvme_io_md": false, 00:19:41.981 "write_zeroes": true, 00:19:41.981 "zcopy": false, 00:19:41.981 "get_zone_info": false, 00:19:41.981 "zone_management": false, 00:19:41.981 "zone_append": false, 00:19:41.981 "compare": false, 00:19:41.981 "compare_and_write": false, 00:19:41.981 "abort": false, 00:19:41.981 "seek_hole": false, 00:19:41.981 "seek_data": false, 00:19:41.981 "copy": false, 00:19:41.981 "nvme_iov_md": false 00:19:41.981 }, 00:19:41.981 "memory_domains": [ 00:19:41.981 { 00:19:41.981 "dma_device_id": "system", 00:19:41.981 "dma_device_type": 1 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.981 "dma_device_type": 2 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "dma_device_id": "system", 00:19:41.981 "dma_device_type": 1 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.981 "dma_device_type": 2 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "dma_device_id": "system", 00:19:41.981 "dma_device_type": 1 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.981 "dma_device_type": 2 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "dma_device_id": "system", 00:19:41.981 "dma_device_type": 1 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.981 "dma_device_type": 2 00:19:41.981 } 00:19:41.981 ], 00:19:41.981 "driver_specific": { 00:19:41.981 "raid": { 00:19:41.981 "uuid": "3a5fb464-33d1-4672-bec2-109297217fd3", 00:19:41.981 "strip_size_kb": 64, 00:19:41.981 "state": "online", 00:19:41.981 "raid_level": "concat", 00:19:41.981 "superblock": false, 
00:19:41.981 "num_base_bdevs": 4, 00:19:41.981 "num_base_bdevs_discovered": 4, 00:19:41.981 "num_base_bdevs_operational": 4, 00:19:41.981 "base_bdevs_list": [ 00:19:41.981 { 00:19:41.981 "name": "NewBaseBdev", 00:19:41.981 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:41.981 "is_configured": true, 00:19:41.981 "data_offset": 0, 00:19:41.981 "data_size": 65536 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "name": "BaseBdev2", 00:19:41.981 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:41.981 "is_configured": true, 00:19:41.981 "data_offset": 0, 00:19:41.981 "data_size": 65536 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "name": "BaseBdev3", 00:19:41.981 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:41.981 "is_configured": true, 00:19:41.981 "data_offset": 0, 00:19:41.981 "data_size": 65536 00:19:41.981 }, 00:19:41.981 { 00:19:41.981 "name": "BaseBdev4", 00:19:41.981 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:41.981 "is_configured": true, 00:19:41.981 "data_offset": 0, 00:19:41.981 "data_size": 65536 00:19:41.981 } 00:19:41.981 ] 00:19:41.981 } 00:19:41.981 } 00:19:41.981 }' 00:19:41.981 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:41.981 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:41.981 BaseBdev2 00:19:41.981 BaseBdev3 00:19:41.981 BaseBdev4' 00:19:41.981 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:41.981 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:41.981 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.240 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
base_bdev_info='{ 00:19:42.240 "name": "NewBaseBdev", 00:19:42.240 "aliases": [ 00:19:42.240 "53b0609a-0550-47a0-b407-5d9751ba7a80" 00:19:42.240 ], 00:19:42.240 "product_name": "Malloc disk", 00:19:42.240 "block_size": 512, 00:19:42.240 "num_blocks": 65536, 00:19:42.240 "uuid": "53b0609a-0550-47a0-b407-5d9751ba7a80", 00:19:42.240 "assigned_rate_limits": { 00:19:42.240 "rw_ios_per_sec": 0, 00:19:42.240 "rw_mbytes_per_sec": 0, 00:19:42.240 "r_mbytes_per_sec": 0, 00:19:42.240 "w_mbytes_per_sec": 0 00:19:42.240 }, 00:19:42.240 "claimed": true, 00:19:42.240 "claim_type": "exclusive_write", 00:19:42.240 "zoned": false, 00:19:42.240 "supported_io_types": { 00:19:42.240 "read": true, 00:19:42.240 "write": true, 00:19:42.240 "unmap": true, 00:19:42.240 "flush": true, 00:19:42.240 "reset": true, 00:19:42.240 "nvme_admin": false, 00:19:42.240 "nvme_io": false, 00:19:42.240 "nvme_io_md": false, 00:19:42.240 "write_zeroes": true, 00:19:42.240 "zcopy": true, 00:19:42.240 "get_zone_info": false, 00:19:42.240 "zone_management": false, 00:19:42.240 "zone_append": false, 00:19:42.240 "compare": false, 00:19:42.240 "compare_and_write": false, 00:19:42.240 "abort": true, 00:19:42.240 "seek_hole": false, 00:19:42.240 "seek_data": false, 00:19:42.240 "copy": true, 00:19:42.240 "nvme_iov_md": false 00:19:42.240 }, 00:19:42.240 "memory_domains": [ 00:19:42.240 { 00:19:42.240 "dma_device_id": "system", 00:19:42.240 "dma_device_type": 1 00:19:42.240 }, 00:19:42.240 { 00:19:42.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.240 "dma_device_type": 2 00:19:42.240 } 00:19:42.240 ], 00:19:42.240 "driver_specific": {} 00:19:42.240 }' 00:19:42.240 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.240 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.240 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:42.240 18:23:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.500 18:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.500 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:42.759 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.759 "name": "BaseBdev2", 00:19:42.759 "aliases": [ 00:19:42.759 "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70" 00:19:42.759 ], 00:19:42.759 "product_name": "Malloc disk", 00:19:42.759 "block_size": 512, 00:19:42.759 "num_blocks": 65536, 00:19:42.759 "uuid": "35431aa8-e99f-4c14-88bd-b8fbcd6b8c70", 00:19:42.759 "assigned_rate_limits": { 00:19:42.759 "rw_ios_per_sec": 0, 00:19:42.759 "rw_mbytes_per_sec": 0, 00:19:42.759 "r_mbytes_per_sec": 0, 00:19:42.759 "w_mbytes_per_sec": 0 00:19:42.759 }, 00:19:42.759 "claimed": true, 00:19:42.759 "claim_type": "exclusive_write", 
00:19:42.759 "zoned": false, 00:19:42.759 "supported_io_types": { 00:19:42.759 "read": true, 00:19:42.759 "write": true, 00:19:42.759 "unmap": true, 00:19:42.759 "flush": true, 00:19:42.759 "reset": true, 00:19:42.759 "nvme_admin": false, 00:19:42.759 "nvme_io": false, 00:19:42.759 "nvme_io_md": false, 00:19:42.759 "write_zeroes": true, 00:19:42.759 "zcopy": true, 00:19:42.759 "get_zone_info": false, 00:19:42.759 "zone_management": false, 00:19:42.759 "zone_append": false, 00:19:42.759 "compare": false, 00:19:42.759 "compare_and_write": false, 00:19:42.759 "abort": true, 00:19:42.759 "seek_hole": false, 00:19:42.759 "seek_data": false, 00:19:42.759 "copy": true, 00:19:42.759 "nvme_iov_md": false 00:19:42.759 }, 00:19:42.759 "memory_domains": [ 00:19:42.759 { 00:19:42.760 "dma_device_id": "system", 00:19:42.760 "dma_device_type": 1 00:19:42.760 }, 00:19:42.760 { 00:19:42.760 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.760 "dma_device_type": 2 00:19:42.760 } 00:19:42.760 ], 00:19:42.760 "driver_specific": {} 00:19:42.760 }' 00:19:42.760 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.760 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.019 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.019 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.019 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.019 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.019 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.019 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.277 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.277 18:23:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.277 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.277 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.277 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.277 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:43.277 18:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.536 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.536 "name": "BaseBdev3", 00:19:43.536 "aliases": [ 00:19:43.536 "e35e6e6f-e586-44dc-a04c-be18f779707f" 00:19:43.536 ], 00:19:43.536 "product_name": "Malloc disk", 00:19:43.536 "block_size": 512, 00:19:43.536 "num_blocks": 65536, 00:19:43.536 "uuid": "e35e6e6f-e586-44dc-a04c-be18f779707f", 00:19:43.536 "assigned_rate_limits": { 00:19:43.536 "rw_ios_per_sec": 0, 00:19:43.536 "rw_mbytes_per_sec": 0, 00:19:43.536 "r_mbytes_per_sec": 0, 00:19:43.536 "w_mbytes_per_sec": 0 00:19:43.536 }, 00:19:43.536 "claimed": true, 00:19:43.536 "claim_type": "exclusive_write", 00:19:43.536 "zoned": false, 00:19:43.536 "supported_io_types": { 00:19:43.536 "read": true, 00:19:43.536 "write": true, 00:19:43.536 "unmap": true, 00:19:43.536 "flush": true, 00:19:43.536 "reset": true, 00:19:43.536 "nvme_admin": false, 00:19:43.536 "nvme_io": false, 00:19:43.536 "nvme_io_md": false, 00:19:43.536 "write_zeroes": true, 00:19:43.536 "zcopy": true, 00:19:43.536 "get_zone_info": false, 00:19:43.536 "zone_management": false, 00:19:43.536 "zone_append": false, 00:19:43.536 "compare": false, 00:19:43.536 "compare_and_write": false, 00:19:43.536 "abort": true, 00:19:43.536 "seek_hole": false, 
00:19:43.536 "seek_data": false, 00:19:43.536 "copy": true, 00:19:43.536 "nvme_iov_md": false 00:19:43.536 }, 00:19:43.536 "memory_domains": [ 00:19:43.536 { 00:19:43.536 "dma_device_id": "system", 00:19:43.536 "dma_device_type": 1 00:19:43.536 }, 00:19:43.536 { 00:19:43.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.536 "dma_device_type": 2 00:19:43.536 } 00:19:43.536 ], 00:19:43.536 "driver_specific": {} 00:19:43.536 }' 00:19:43.536 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.536 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.536 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.536 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.536 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.794 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:44.053 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:44.053 "name": "BaseBdev4", 00:19:44.053 "aliases": [ 00:19:44.053 "d5d89953-f807-4676-9ece-9b6628ee6024" 00:19:44.053 ], 00:19:44.053 "product_name": "Malloc disk", 00:19:44.053 "block_size": 512, 00:19:44.053 "num_blocks": 65536, 00:19:44.053 "uuid": "d5d89953-f807-4676-9ece-9b6628ee6024", 00:19:44.053 "assigned_rate_limits": { 00:19:44.053 "rw_ios_per_sec": 0, 00:19:44.053 "rw_mbytes_per_sec": 0, 00:19:44.053 "r_mbytes_per_sec": 0, 00:19:44.053 "w_mbytes_per_sec": 0 00:19:44.053 }, 00:19:44.053 "claimed": true, 00:19:44.053 "claim_type": "exclusive_write", 00:19:44.053 "zoned": false, 00:19:44.053 "supported_io_types": { 00:19:44.053 "read": true, 00:19:44.053 "write": true, 00:19:44.053 "unmap": true, 00:19:44.053 "flush": true, 00:19:44.053 "reset": true, 00:19:44.053 "nvme_admin": false, 00:19:44.053 "nvme_io": false, 00:19:44.053 "nvme_io_md": false, 00:19:44.053 "write_zeroes": true, 00:19:44.053 "zcopy": true, 00:19:44.053 "get_zone_info": false, 00:19:44.053 "zone_management": false, 00:19:44.053 "zone_append": false, 00:19:44.053 "compare": false, 00:19:44.053 "compare_and_write": false, 00:19:44.053 "abort": true, 00:19:44.053 "seek_hole": false, 00:19:44.053 "seek_data": false, 00:19:44.053 "copy": true, 00:19:44.053 "nvme_iov_md": false 00:19:44.053 }, 00:19:44.053 "memory_domains": [ 00:19:44.053 { 00:19:44.053 "dma_device_id": "system", 00:19:44.053 "dma_device_type": 1 00:19:44.053 }, 00:19:44.053 { 00:19:44.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.053 "dma_device_type": 2 00:19:44.053 } 00:19:44.053 ], 00:19:44.053 "driver_specific": {} 00:19:44.053 }' 00:19:44.053 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.053 18:23:27 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.053 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.053 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.312 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.312 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.312 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.312 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.312 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.312 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.312 18:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:44.571 [2024-07-12 18:23:28.193870] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:44.571 [2024-07-12 18:23:28.193899] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:44.571 [2024-07-12 18:23:28.193954] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:44.571 [2024-07-12 18:23:28.194011] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:44.571 [2024-07-12 18:23:28.194023] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd4040 name Existed_Raid, state offline 00:19:44.571 18:23:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2533755 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2533755 ']' 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2533755 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2533755 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2533755' 00:19:44.571 killing process with pid 2533755 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2533755 00:19:44.571 [2024-07-12 18:23:28.250104] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:44.571 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2533755 00:19:44.571 [2024-07-12 18:23:28.287908] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:44.831 18:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:44.831 00:19:44.831 real 0m32.548s 00:19:44.831 user 0m59.766s 00:19:44.831 sys 0m5.749s 00:19:44.831 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:44.831 18:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:44.831 ************************************ 00:19:44.831 END TEST 
raid_state_function_test 00:19:44.831 ************************************ 00:19:44.831 18:23:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:44.831 18:23:28 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:19:44.831 18:23:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:44.831 18:23:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:44.831 18:23:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:45.090 ************************************ 00:19:45.090 START TEST raid_state_function_test_sb 00:19:45.090 ************************************ 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:45.090 18:23:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2538645 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2538645' 00:19:45.090 Process raid pid: 2538645 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2538645 /var/tmp/spdk-raid.sock 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2538645 ']' 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:45.090 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:45.090 18:23:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:45.090 [2024-07-12 18:23:28.634920] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:19:45.090 [2024-07-12 18:23:28.634992] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:45.090 [2024-07-12 18:23:28.754085] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:45.350 [2024-07-12 18:23:28.856838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:45.350 [2024-07-12 18:23:28.917769] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:45.350 [2024-07-12 18:23:28.917805] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:45.918 18:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:45.918 18:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:45.918 18:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:46.486 [2024-07-12 18:23:30.028164] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:46.486 [2024-07-12 18:23:30.028210] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:46.486 [2024-07-12 18:23:30.028223] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:46.486 [2024-07-12 18:23:30.028239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:46.486 [2024-07-12 18:23:30.028250] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:46.486 [2024-07-12 18:23:30.028265] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:19:46.486 [2024-07-12 18:23:30.028276] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:46.486 [2024-07-12 18:23:30.028290] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.486 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:46.745 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.745 "name": "Existed_Raid", 00:19:46.745 "uuid": 
"232a8ff5-00f9-4ecb-91d3-c342bc90f271", 00:19:46.745 "strip_size_kb": 64, 00:19:46.745 "state": "configuring", 00:19:46.745 "raid_level": "concat", 00:19:46.745 "superblock": true, 00:19:46.745 "num_base_bdevs": 4, 00:19:46.745 "num_base_bdevs_discovered": 0, 00:19:46.745 "num_base_bdevs_operational": 4, 00:19:46.745 "base_bdevs_list": [ 00:19:46.745 { 00:19:46.745 "name": "BaseBdev1", 00:19:46.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.745 "is_configured": false, 00:19:46.745 "data_offset": 0, 00:19:46.745 "data_size": 0 00:19:46.745 }, 00:19:46.745 { 00:19:46.745 "name": "BaseBdev2", 00:19:46.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.745 "is_configured": false, 00:19:46.745 "data_offset": 0, 00:19:46.745 "data_size": 0 00:19:46.745 }, 00:19:46.745 { 00:19:46.745 "name": "BaseBdev3", 00:19:46.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.745 "is_configured": false, 00:19:46.745 "data_offset": 0, 00:19:46.745 "data_size": 0 00:19:46.745 }, 00:19:46.745 { 00:19:46.745 "name": "BaseBdev4", 00:19:46.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.745 "is_configured": false, 00:19:46.745 "data_offset": 0, 00:19:46.745 "data_size": 0 00:19:46.745 } 00:19:46.745 ] 00:19:46.745 }' 00:19:46.745 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.745 18:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:47.313 18:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:47.621 [2024-07-12 18:23:31.090803] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:47.621 [2024-07-12 18:23:31.090832] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1401aa0 name Existed_Raid, state configuring 00:19:47.621 18:23:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:47.621 [2024-07-12 18:23:31.259288] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:47.621 [2024-07-12 18:23:31.259320] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:47.621 [2024-07-12 18:23:31.259329] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:47.621 [2024-07-12 18:23:31.259341] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:47.621 [2024-07-12 18:23:31.259349] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:47.621 [2024-07-12 18:23:31.259360] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:47.621 [2024-07-12 18:23:31.259369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:47.621 [2024-07-12 18:23:31.259380] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:47.621 18:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:47.880 [2024-07-12 18:23:31.509824] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:47.880 BaseBdev1 00:19:47.880 18:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:47.880 18:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:47.880 18:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_timeout= 00:19:47.880 18:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:47.880 18:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:47.880 18:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:47.880 18:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:48.139 18:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:48.706 [ 00:19:48.706 { 00:19:48.706 "name": "BaseBdev1", 00:19:48.706 "aliases": [ 00:19:48.706 "953be238-2764-4c9e-b398-1462264fa3dd" 00:19:48.706 ], 00:19:48.706 "product_name": "Malloc disk", 00:19:48.706 "block_size": 512, 00:19:48.706 "num_blocks": 65536, 00:19:48.706 "uuid": "953be238-2764-4c9e-b398-1462264fa3dd", 00:19:48.706 "assigned_rate_limits": { 00:19:48.706 "rw_ios_per_sec": 0, 00:19:48.706 "rw_mbytes_per_sec": 0, 00:19:48.706 "r_mbytes_per_sec": 0, 00:19:48.706 "w_mbytes_per_sec": 0 00:19:48.706 }, 00:19:48.706 "claimed": true, 00:19:48.706 "claim_type": "exclusive_write", 00:19:48.706 "zoned": false, 00:19:48.706 "supported_io_types": { 00:19:48.706 "read": true, 00:19:48.706 "write": true, 00:19:48.707 "unmap": true, 00:19:48.707 "flush": true, 00:19:48.707 "reset": true, 00:19:48.707 "nvme_admin": false, 00:19:48.707 "nvme_io": false, 00:19:48.707 "nvme_io_md": false, 00:19:48.707 "write_zeroes": true, 00:19:48.707 "zcopy": true, 00:19:48.707 "get_zone_info": false, 00:19:48.707 "zone_management": false, 00:19:48.707 "zone_append": false, 00:19:48.707 "compare": false, 00:19:48.707 "compare_and_write": false, 00:19:48.707 "abort": true, 00:19:48.707 "seek_hole": 
false, 00:19:48.707 "seek_data": false, 00:19:48.707 "copy": true, 00:19:48.707 "nvme_iov_md": false 00:19:48.707 }, 00:19:48.707 "memory_domains": [ 00:19:48.707 { 00:19:48.707 "dma_device_id": "system", 00:19:48.707 "dma_device_type": 1 00:19:48.707 }, 00:19:48.707 { 00:19:48.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.707 "dma_device_type": 2 00:19:48.707 } 00:19:48.707 ], 00:19:48.707 "driver_specific": {} 00:19:48.707 } 00:19:48.707 ] 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.707 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.707 18:23:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:48.966 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.966 "name": "Existed_Raid", 00:19:48.966 "uuid": "acbc26ee-146b-44a5-b8ef-c815e5e3b8be", 00:19:48.966 "strip_size_kb": 64, 00:19:48.966 "state": "configuring", 00:19:48.966 "raid_level": "concat", 00:19:48.966 "superblock": true, 00:19:48.966 "num_base_bdevs": 4, 00:19:48.966 "num_base_bdevs_discovered": 1, 00:19:48.966 "num_base_bdevs_operational": 4, 00:19:48.966 "base_bdevs_list": [ 00:19:48.966 { 00:19:48.966 "name": "BaseBdev1", 00:19:48.966 "uuid": "953be238-2764-4c9e-b398-1462264fa3dd", 00:19:48.966 "is_configured": true, 00:19:48.966 "data_offset": 2048, 00:19:48.966 "data_size": 63488 00:19:48.966 }, 00:19:48.966 { 00:19:48.966 "name": "BaseBdev2", 00:19:48.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.966 "is_configured": false, 00:19:48.966 "data_offset": 0, 00:19:48.966 "data_size": 0 00:19:48.966 }, 00:19:48.966 { 00:19:48.966 "name": "BaseBdev3", 00:19:48.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.966 "is_configured": false, 00:19:48.966 "data_offset": 0, 00:19:48.966 "data_size": 0 00:19:48.966 }, 00:19:48.966 { 00:19:48.966 "name": "BaseBdev4", 00:19:48.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.966 "is_configured": false, 00:19:48.966 "data_offset": 0, 00:19:48.966 "data_size": 0 00:19:48.966 } 00:19:48.966 ] 00:19:48.966 }' 00:19:48.966 18:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:48.966 18:23:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:49.545 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:49.804 [2024-07-12 
18:23:33.330641] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:49.804 [2024-07-12 18:23:33.330677] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1401310 name Existed_Raid, state configuring 00:19:49.804 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:50.062 [2024-07-12 18:23:33.571330] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:50.062 [2024-07-12 18:23:33.572779] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:50.062 [2024-07-12 18:23:33.572811] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:50.062 [2024-07-12 18:23:33.572821] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:50.062 [2024-07-12 18:23:33.572833] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:50.062 [2024-07-12 18:23:33.572842] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:50.062 [2024-07-12 18:23:33.572853] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.062 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:50.320 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.320 "name": "Existed_Raid", 00:19:50.320 "uuid": "1872619a-f60b-423b-8a5f-bb52ecabef33", 00:19:50.320 "strip_size_kb": 64, 00:19:50.320 "state": "configuring", 00:19:50.320 "raid_level": "concat", 00:19:50.320 "superblock": true, 00:19:50.320 "num_base_bdevs": 4, 00:19:50.320 "num_base_bdevs_discovered": 1, 00:19:50.320 "num_base_bdevs_operational": 4, 00:19:50.320 "base_bdevs_list": [ 00:19:50.320 { 00:19:50.320 "name": "BaseBdev1", 00:19:50.320 "uuid": "953be238-2764-4c9e-b398-1462264fa3dd", 00:19:50.320 "is_configured": true, 00:19:50.320 "data_offset": 2048, 00:19:50.320 "data_size": 63488 00:19:50.320 }, 00:19:50.320 { 00:19:50.320 "name": "BaseBdev2", 00:19:50.320 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:50.320 "is_configured": false, 00:19:50.320 "data_offset": 0, 00:19:50.320 "data_size": 0 00:19:50.320 }, 00:19:50.320 { 00:19:50.320 "name": "BaseBdev3", 00:19:50.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.320 "is_configured": false, 00:19:50.320 "data_offset": 0, 00:19:50.320 "data_size": 0 00:19:50.320 }, 00:19:50.320 { 00:19:50.320 "name": "BaseBdev4", 00:19:50.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.320 "is_configured": false, 00:19:50.320 "data_offset": 0, 00:19:50.320 "data_size": 0 00:19:50.320 } 00:19:50.320 ] 00:19:50.320 }' 00:19:50.320 18:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.320 18:23:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:50.887 18:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:50.887 [2024-07-12 18:23:34.609385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:50.887 BaseBdev2 00:19:51.146 18:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:51.146 18:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:51.146 18:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:51.146 18:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:51.146 18:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:51.146 18:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:51.146 18:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:51.405 18:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:51.405 [ 00:19:51.405 { 00:19:51.405 "name": "BaseBdev2", 00:19:51.405 "aliases": [ 00:19:51.405 "3e00b82c-64e6-4bf4-8cc2-a3f860714dd8" 00:19:51.405 ], 00:19:51.405 "product_name": "Malloc disk", 00:19:51.405 "block_size": 512, 00:19:51.405 "num_blocks": 65536, 00:19:51.405 "uuid": "3e00b82c-64e6-4bf4-8cc2-a3f860714dd8", 00:19:51.405 "assigned_rate_limits": { 00:19:51.405 "rw_ios_per_sec": 0, 00:19:51.405 "rw_mbytes_per_sec": 0, 00:19:51.405 "r_mbytes_per_sec": 0, 00:19:51.405 "w_mbytes_per_sec": 0 00:19:51.405 }, 00:19:51.405 "claimed": true, 00:19:51.405 "claim_type": "exclusive_write", 00:19:51.405 "zoned": false, 00:19:51.405 "supported_io_types": { 00:19:51.405 "read": true, 00:19:51.405 "write": true, 00:19:51.405 "unmap": true, 00:19:51.405 "flush": true, 00:19:51.405 "reset": true, 00:19:51.405 "nvme_admin": false, 00:19:51.405 "nvme_io": false, 00:19:51.405 "nvme_io_md": false, 00:19:51.405 "write_zeroes": true, 00:19:51.405 "zcopy": true, 00:19:51.405 "get_zone_info": false, 00:19:51.405 "zone_management": false, 00:19:51.405 "zone_append": false, 00:19:51.405 "compare": false, 00:19:51.405 "compare_and_write": false, 00:19:51.405 "abort": true, 00:19:51.405 "seek_hole": false, 00:19:51.405 "seek_data": false, 00:19:51.405 "copy": true, 00:19:51.405 "nvme_iov_md": false 00:19:51.405 }, 00:19:51.405 "memory_domains": [ 00:19:51.405 { 00:19:51.405 "dma_device_id": "system", 00:19:51.405 "dma_device_type": 1 00:19:51.405 }, 00:19:51.405 { 00:19:51.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.405 "dma_device_type": 2 00:19:51.405 } 00:19:51.405 ], 00:19:51.405 "driver_specific": {} 00:19:51.405 } 00:19:51.405 ] 
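After each base bdev is claimed, the harness re-queries the array with `bdev_raid_get_bdevs all`, narrows the output with `jq -r '.[] | select(.name == "Existed_Raid")'`, and `verify_raid_bdev_state` checks individual fields of the captured `$raid_bdev_info`. A rough self-contained sketch of that field-checking step, using a crude `sed` extractor as a stand-in for `jq` and a sample abbreviated from the JSON dumps in this log:

```shell
#!/usr/bin/env bash
# Sketch of how verify_raid_bdev_state-style checks read fields out of
# the jq-selected object. The extractor below is a stand-in for jq,
# good enough only for the flat one-field-per-line sample here.
raid_bdev_info='{
  "name": "Existed_Raid",
  "state": "configuring",
  "raid_level": "concat",
  "strip_size_kb": 64,
  "num_base_bdevs_discovered": 2,
  "num_base_bdevs_operational": 4
}'
get_field() {  # crude per-line field extractor (illustrative, not jq)
    echo "$raid_bdev_info" | sed -n "s/.*\"$1\": *\"\{0,1\}\([^\",]*\)\"\{0,1\},\{0,1\}\$/\1/p"
}
[ "$(get_field state)" = "configuring" ] && echo "state ok"
```

The sample's `"num_base_bdevs_discovered": 2` matches the log at this point: BaseBdev1 and BaseBdev2 are claimed, the array stays `configuring` until all four base bdevs exist.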
00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:51.405 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.664 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.664 "name": "Existed_Raid", 
00:19:51.664 "uuid": "1872619a-f60b-423b-8a5f-bb52ecabef33", 00:19:51.664 "strip_size_kb": 64, 00:19:51.664 "state": "configuring", 00:19:51.664 "raid_level": "concat", 00:19:51.664 "superblock": true, 00:19:51.664 "num_base_bdevs": 4, 00:19:51.664 "num_base_bdevs_discovered": 2, 00:19:51.664 "num_base_bdevs_operational": 4, 00:19:51.664 "base_bdevs_list": [ 00:19:51.664 { 00:19:51.664 "name": "BaseBdev1", 00:19:51.664 "uuid": "953be238-2764-4c9e-b398-1462264fa3dd", 00:19:51.664 "is_configured": true, 00:19:51.664 "data_offset": 2048, 00:19:51.664 "data_size": 63488 00:19:51.664 }, 00:19:51.664 { 00:19:51.664 "name": "BaseBdev2", 00:19:51.664 "uuid": "3e00b82c-64e6-4bf4-8cc2-a3f860714dd8", 00:19:51.664 "is_configured": true, 00:19:51.664 "data_offset": 2048, 00:19:51.665 "data_size": 63488 00:19:51.665 }, 00:19:51.665 { 00:19:51.665 "name": "BaseBdev3", 00:19:51.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.665 "is_configured": false, 00:19:51.665 "data_offset": 0, 00:19:51.665 "data_size": 0 00:19:51.665 }, 00:19:51.665 { 00:19:51.665 "name": "BaseBdev4", 00:19:51.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.665 "is_configured": false, 00:19:51.665 "data_offset": 0, 00:19:51.665 "data_size": 0 00:19:51.665 } 00:19:51.665 ] 00:19:51.665 }' 00:19:51.665 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.665 18:23:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:52.600 18:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:52.600 [2024-07-12 18:23:36.204945] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:52.600 BaseBdev3 00:19:52.600 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:52.600 
18:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:52.600 18:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:52.600 18:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:52.600 18:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:52.600 18:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:52.600 18:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:52.859 18:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:53.118 [ 00:19:53.118 { 00:19:53.118 "name": "BaseBdev3", 00:19:53.118 "aliases": [ 00:19:53.118 "7a754565-477b-4b6d-adfa-2fca90c414d2" 00:19:53.118 ], 00:19:53.118 "product_name": "Malloc disk", 00:19:53.118 "block_size": 512, 00:19:53.118 "num_blocks": 65536, 00:19:53.118 "uuid": "7a754565-477b-4b6d-adfa-2fca90c414d2", 00:19:53.118 "assigned_rate_limits": { 00:19:53.118 "rw_ios_per_sec": 0, 00:19:53.118 "rw_mbytes_per_sec": 0, 00:19:53.118 "r_mbytes_per_sec": 0, 00:19:53.118 "w_mbytes_per_sec": 0 00:19:53.118 }, 00:19:53.118 "claimed": true, 00:19:53.118 "claim_type": "exclusive_write", 00:19:53.118 "zoned": false, 00:19:53.118 "supported_io_types": { 00:19:53.118 "read": true, 00:19:53.118 "write": true, 00:19:53.118 "unmap": true, 00:19:53.118 "flush": true, 00:19:53.118 "reset": true, 00:19:53.118 "nvme_admin": false, 00:19:53.118 "nvme_io": false, 00:19:53.118 "nvme_io_md": false, 00:19:53.118 "write_zeroes": true, 00:19:53.118 "zcopy": true, 00:19:53.118 "get_zone_info": 
false, 00:19:53.118 "zone_management": false, 00:19:53.118 "zone_append": false, 00:19:53.118 "compare": false, 00:19:53.118 "compare_and_write": false, 00:19:53.118 "abort": true, 00:19:53.118 "seek_hole": false, 00:19:53.118 "seek_data": false, 00:19:53.118 "copy": true, 00:19:53.118 "nvme_iov_md": false 00:19:53.118 }, 00:19:53.118 "memory_domains": [ 00:19:53.118 { 00:19:53.118 "dma_device_id": "system", 00:19:53.118 "dma_device_type": 1 00:19:53.118 }, 00:19:53.118 { 00:19:53.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.118 "dma_device_type": 2 00:19:53.118 } 00:19:53.118 ], 00:19:53.118 "driver_specific": {} 00:19:53.118 } 00:19:53.118 ] 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.118 18:23:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.118 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:53.377 18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.377 "name": "Existed_Raid", 00:19:53.377 "uuid": "1872619a-f60b-423b-8a5f-bb52ecabef33", 00:19:53.377 "strip_size_kb": 64, 00:19:53.377 "state": "configuring", 00:19:53.377 "raid_level": "concat", 00:19:53.377 "superblock": true, 00:19:53.377 "num_base_bdevs": 4, 00:19:53.377 "num_base_bdevs_discovered": 3, 00:19:53.377 "num_base_bdevs_operational": 4, 00:19:53.377 "base_bdevs_list": [ 00:19:53.377 { 00:19:53.377 "name": "BaseBdev1", 00:19:53.377 "uuid": "953be238-2764-4c9e-b398-1462264fa3dd", 00:19:53.377 "is_configured": true, 00:19:53.377 "data_offset": 2048, 00:19:53.377 "data_size": 63488 00:19:53.377 }, 00:19:53.377 { 00:19:53.377 "name": "BaseBdev2", 00:19:53.377 "uuid": "3e00b82c-64e6-4bf4-8cc2-a3f860714dd8", 00:19:53.377 "is_configured": true, 00:19:53.377 "data_offset": 2048, 00:19:53.377 "data_size": 63488 00:19:53.377 }, 00:19:53.377 { 00:19:53.377 "name": "BaseBdev3", 00:19:53.377 "uuid": "7a754565-477b-4b6d-adfa-2fca90c414d2", 00:19:53.377 "is_configured": true, 00:19:53.377 "data_offset": 2048, 00:19:53.377 "data_size": 63488 00:19:53.377 }, 00:19:53.377 { 00:19:53.377 "name": "BaseBdev4", 00:19:53.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.377 "is_configured": false, 00:19:53.377 "data_offset": 0, 00:19:53.377 "data_size": 0 00:19:53.377 } 00:19:53.377 ] 00:19:53.377 }' 00:19:53.377 
18:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.377 18:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:53.945 18:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:54.204 [2024-07-12 18:23:37.781624] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:54.204 [2024-07-12 18:23:37.781788] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1402350 00:19:54.204 [2024-07-12 18:23:37.781801] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:54.204 [2024-07-12 18:23:37.781980] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1402020 00:19:54.204 [2024-07-12 18:23:37.782103] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1402350 00:19:54.204 [2024-07-12 18:23:37.782113] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1402350 00:19:54.204 [2024-07-12 18:23:37.782202] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:54.204 BaseBdev4 00:19:54.204 18:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:54.204 18:23:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:54.204 18:23:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:54.204 18:23:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:54.204 18:23:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:54.204 18:23:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:19:54.204 18:23:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:54.463 18:23:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:54.722 [ 00:19:54.722 { 00:19:54.722 "name": "BaseBdev4", 00:19:54.722 "aliases": [ 00:19:54.722 "6867ae8f-bf82-47af-a9af-b2956ed7163c" 00:19:54.722 ], 00:19:54.722 "product_name": "Malloc disk", 00:19:54.722 "block_size": 512, 00:19:54.722 "num_blocks": 65536, 00:19:54.722 "uuid": "6867ae8f-bf82-47af-a9af-b2956ed7163c", 00:19:54.722 "assigned_rate_limits": { 00:19:54.722 "rw_ios_per_sec": 0, 00:19:54.722 "rw_mbytes_per_sec": 0, 00:19:54.722 "r_mbytes_per_sec": 0, 00:19:54.722 "w_mbytes_per_sec": 0 00:19:54.722 }, 00:19:54.722 "claimed": true, 00:19:54.722 "claim_type": "exclusive_write", 00:19:54.722 "zoned": false, 00:19:54.722 "supported_io_types": { 00:19:54.722 "read": true, 00:19:54.722 "write": true, 00:19:54.722 "unmap": true, 00:19:54.722 "flush": true, 00:19:54.722 "reset": true, 00:19:54.722 "nvme_admin": false, 00:19:54.722 "nvme_io": false, 00:19:54.722 "nvme_io_md": false, 00:19:54.722 "write_zeroes": true, 00:19:54.722 "zcopy": true, 00:19:54.722 "get_zone_info": false, 00:19:54.722 "zone_management": false, 00:19:54.722 "zone_append": false, 00:19:54.722 "compare": false, 00:19:54.722 "compare_and_write": false, 00:19:54.722 "abort": true, 00:19:54.722 "seek_hole": false, 00:19:54.722 "seek_data": false, 00:19:54.722 "copy": true, 00:19:54.722 "nvme_iov_md": false 00:19:54.722 }, 00:19:54.722 "memory_domains": [ 00:19:54.722 { 00:19:54.722 "dma_device_id": "system", 00:19:54.722 "dma_device_type": 1 00:19:54.722 }, 00:19:54.722 { 00:19:54.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.722 
"dma_device_type": 2 00:19:54.722 } 00:19:54.722 ], 00:19:54.722 "driver_specific": {} 00:19:54.722 } 00:19:54.722 ] 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.722 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:54.982 18:23:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:54.982 "name": "Existed_Raid", 00:19:54.982 "uuid": "1872619a-f60b-423b-8a5f-bb52ecabef33", 00:19:54.982 "strip_size_kb": 64, 00:19:54.982 "state": "online", 00:19:54.982 "raid_level": "concat", 00:19:54.982 "superblock": true, 00:19:54.982 "num_base_bdevs": 4, 00:19:54.982 "num_base_bdevs_discovered": 4, 00:19:54.982 "num_base_bdevs_operational": 4, 00:19:54.982 "base_bdevs_list": [ 00:19:54.982 { 00:19:54.982 "name": "BaseBdev1", 00:19:54.982 "uuid": "953be238-2764-4c9e-b398-1462264fa3dd", 00:19:54.982 "is_configured": true, 00:19:54.982 "data_offset": 2048, 00:19:54.982 "data_size": 63488 00:19:54.982 }, 00:19:54.982 { 00:19:54.982 "name": "BaseBdev2", 00:19:54.982 "uuid": "3e00b82c-64e6-4bf4-8cc2-a3f860714dd8", 00:19:54.982 "is_configured": true, 00:19:54.982 "data_offset": 2048, 00:19:54.982 "data_size": 63488 00:19:54.982 }, 00:19:54.982 { 00:19:54.982 "name": "BaseBdev3", 00:19:54.982 "uuid": "7a754565-477b-4b6d-adfa-2fca90c414d2", 00:19:54.982 "is_configured": true, 00:19:54.982 "data_offset": 2048, 00:19:54.982 "data_size": 63488 00:19:54.982 }, 00:19:54.982 { 00:19:54.982 "name": "BaseBdev4", 00:19:54.982 "uuid": "6867ae8f-bf82-47af-a9af-b2956ed7163c", 00:19:54.982 "is_configured": true, 00:19:54.982 "data_offset": 2048, 00:19:54.982 "data_size": 63488 00:19:54.982 } 00:19:54.982 ] 00:19:54.982 }' 00:19:54.982 18:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:54.982 18:23:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:55.549 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:55.550 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:55.550 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:19:55.550 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:55.550 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:55.550 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:55.550 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:55.550 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:55.808 [2024-07-12 18:23:39.362135] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:55.808 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:55.808 "name": "Existed_Raid", 00:19:55.808 "aliases": [ 00:19:55.808 "1872619a-f60b-423b-8a5f-bb52ecabef33" 00:19:55.808 ], 00:19:55.808 "product_name": "Raid Volume", 00:19:55.808 "block_size": 512, 00:19:55.808 "num_blocks": 253952, 00:19:55.808 "uuid": "1872619a-f60b-423b-8a5f-bb52ecabef33", 00:19:55.808 "assigned_rate_limits": { 00:19:55.808 "rw_ios_per_sec": 0, 00:19:55.808 "rw_mbytes_per_sec": 0, 00:19:55.808 "r_mbytes_per_sec": 0, 00:19:55.808 "w_mbytes_per_sec": 0 00:19:55.808 }, 00:19:55.808 "claimed": false, 00:19:55.808 "zoned": false, 00:19:55.808 "supported_io_types": { 00:19:55.808 "read": true, 00:19:55.808 "write": true, 00:19:55.808 "unmap": true, 00:19:55.808 "flush": true, 00:19:55.808 "reset": true, 00:19:55.808 "nvme_admin": false, 00:19:55.808 "nvme_io": false, 00:19:55.808 "nvme_io_md": false, 00:19:55.808 "write_zeroes": true, 00:19:55.808 "zcopy": false, 00:19:55.808 "get_zone_info": false, 00:19:55.808 "zone_management": false, 00:19:55.808 "zone_append": false, 00:19:55.808 "compare": false, 00:19:55.808 "compare_and_write": false, 00:19:55.808 "abort": false, 00:19:55.808 "seek_hole": 
false, 00:19:55.808 "seek_data": false, 00:19:55.808 "copy": false, 00:19:55.808 "nvme_iov_md": false 00:19:55.808 }, 00:19:55.808 "memory_domains": [ 00:19:55.808 { 00:19:55.808 "dma_device_id": "system", 00:19:55.808 "dma_device_type": 1 00:19:55.808 }, 00:19:55.808 { 00:19:55.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.808 "dma_device_type": 2 00:19:55.808 }, 00:19:55.808 { 00:19:55.808 "dma_device_id": "system", 00:19:55.808 "dma_device_type": 1 00:19:55.808 }, 00:19:55.808 { 00:19:55.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.808 "dma_device_type": 2 00:19:55.808 }, 00:19:55.808 { 00:19:55.808 "dma_device_id": "system", 00:19:55.808 "dma_device_type": 1 00:19:55.808 }, 00:19:55.808 { 00:19:55.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.808 "dma_device_type": 2 00:19:55.808 }, 00:19:55.808 { 00:19:55.808 "dma_device_id": "system", 00:19:55.808 "dma_device_type": 1 00:19:55.808 }, 00:19:55.808 { 00:19:55.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.808 "dma_device_type": 2 00:19:55.808 } 00:19:55.808 ], 00:19:55.808 "driver_specific": { 00:19:55.808 "raid": { 00:19:55.809 "uuid": "1872619a-f60b-423b-8a5f-bb52ecabef33", 00:19:55.809 "strip_size_kb": 64, 00:19:55.809 "state": "online", 00:19:55.809 "raid_level": "concat", 00:19:55.809 "superblock": true, 00:19:55.809 "num_base_bdevs": 4, 00:19:55.809 "num_base_bdevs_discovered": 4, 00:19:55.809 "num_base_bdevs_operational": 4, 00:19:55.809 "base_bdevs_list": [ 00:19:55.809 { 00:19:55.809 "name": "BaseBdev1", 00:19:55.809 "uuid": "953be238-2764-4c9e-b398-1462264fa3dd", 00:19:55.809 "is_configured": true, 00:19:55.809 "data_offset": 2048, 00:19:55.809 "data_size": 63488 00:19:55.809 }, 00:19:55.809 { 00:19:55.809 "name": "BaseBdev2", 00:19:55.809 "uuid": "3e00b82c-64e6-4bf4-8cc2-a3f860714dd8", 00:19:55.809 "is_configured": true, 00:19:55.809 "data_offset": 2048, 00:19:55.809 "data_size": 63488 00:19:55.809 }, 00:19:55.809 { 00:19:55.809 "name": "BaseBdev3", 00:19:55.809 
"uuid": "7a754565-477b-4b6d-adfa-2fca90c414d2", 00:19:55.809 "is_configured": true, 00:19:55.809 "data_offset": 2048, 00:19:55.809 "data_size": 63488 00:19:55.809 }, 00:19:55.809 { 00:19:55.809 "name": "BaseBdev4", 00:19:55.809 "uuid": "6867ae8f-bf82-47af-a9af-b2956ed7163c", 00:19:55.809 "is_configured": true, 00:19:55.809 "data_offset": 2048, 00:19:55.809 "data_size": 63488 00:19:55.809 } 00:19:55.809 ] 00:19:55.809 } 00:19:55.809 } 00:19:55.809 }' 00:19:55.809 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:55.809 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:55.809 BaseBdev2 00:19:55.809 BaseBdev3 00:19:55.809 BaseBdev4' 00:19:55.809 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.809 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:55.809 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:56.067 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:56.067 "name": "BaseBdev1", 00:19:56.067 "aliases": [ 00:19:56.067 "953be238-2764-4c9e-b398-1462264fa3dd" 00:19:56.067 ], 00:19:56.067 "product_name": "Malloc disk", 00:19:56.067 "block_size": 512, 00:19:56.067 "num_blocks": 65536, 00:19:56.067 "uuid": "953be238-2764-4c9e-b398-1462264fa3dd", 00:19:56.067 "assigned_rate_limits": { 00:19:56.067 "rw_ios_per_sec": 0, 00:19:56.067 "rw_mbytes_per_sec": 0, 00:19:56.067 "r_mbytes_per_sec": 0, 00:19:56.067 "w_mbytes_per_sec": 0 00:19:56.067 }, 00:19:56.067 "claimed": true, 00:19:56.067 "claim_type": "exclusive_write", 00:19:56.067 "zoned": false, 00:19:56.067 "supported_io_types": { 
00:19:56.067 "read": true, 00:19:56.067 "write": true, 00:19:56.067 "unmap": true, 00:19:56.067 "flush": true, 00:19:56.067 "reset": true, 00:19:56.067 "nvme_admin": false, 00:19:56.067 "nvme_io": false, 00:19:56.067 "nvme_io_md": false, 00:19:56.067 "write_zeroes": true, 00:19:56.067 "zcopy": true, 00:19:56.067 "get_zone_info": false, 00:19:56.067 "zone_management": false, 00:19:56.067 "zone_append": false, 00:19:56.067 "compare": false, 00:19:56.067 "compare_and_write": false, 00:19:56.067 "abort": true, 00:19:56.067 "seek_hole": false, 00:19:56.067 "seek_data": false, 00:19:56.067 "copy": true, 00:19:56.067 "nvme_iov_md": false 00:19:56.067 }, 00:19:56.067 "memory_domains": [ 00:19:56.067 { 00:19:56.067 "dma_device_id": "system", 00:19:56.067 "dma_device_type": 1 00:19:56.067 }, 00:19:56.067 { 00:19:56.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.067 "dma_device_type": 2 00:19:56.067 } 00:19:56.067 ], 00:19:56.067 "driver_specific": {} 00:19:56.067 }' 00:19:56.067 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.067 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.067 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:56.067 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.325 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.326 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:56.326 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.326 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.326 18:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:56.326 18:23:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.326 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.326 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:56.326 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:56.585 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:56.585 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:56.585 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:56.585 "name": "BaseBdev2", 00:19:56.585 "aliases": [ 00:19:56.585 "3e00b82c-64e6-4bf4-8cc2-a3f860714dd8" 00:19:56.585 ], 00:19:56.585 "product_name": "Malloc disk", 00:19:56.585 "block_size": 512, 00:19:56.585 "num_blocks": 65536, 00:19:56.585 "uuid": "3e00b82c-64e6-4bf4-8cc2-a3f860714dd8", 00:19:56.585 "assigned_rate_limits": { 00:19:56.585 "rw_ios_per_sec": 0, 00:19:56.585 "rw_mbytes_per_sec": 0, 00:19:56.585 "r_mbytes_per_sec": 0, 00:19:56.585 "w_mbytes_per_sec": 0 00:19:56.585 }, 00:19:56.585 "claimed": true, 00:19:56.585 "claim_type": "exclusive_write", 00:19:56.585 "zoned": false, 00:19:56.585 "supported_io_types": { 00:19:56.585 "read": true, 00:19:56.585 "write": true, 00:19:56.585 "unmap": true, 00:19:56.585 "flush": true, 00:19:56.585 "reset": true, 00:19:56.585 "nvme_admin": false, 00:19:56.585 "nvme_io": false, 00:19:56.585 "nvme_io_md": false, 00:19:56.585 "write_zeroes": true, 00:19:56.585 "zcopy": true, 00:19:56.585 "get_zone_info": false, 00:19:56.585 "zone_management": false, 00:19:56.585 "zone_append": false, 00:19:56.585 "compare": false, 00:19:56.585 "compare_and_write": false, 00:19:56.585 "abort": true, 00:19:56.585 "seek_hole": false, 00:19:56.585 "seek_data": 
false, 00:19:56.585 "copy": true, 00:19:56.585 "nvme_iov_md": false 00:19:56.585 }, 00:19:56.585 "memory_domains": [ 00:19:56.585 { 00:19:56.585 "dma_device_id": "system", 00:19:56.585 "dma_device_type": 1 00:19:56.585 }, 00:19:56.585 { 00:19:56.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.585 "dma_device_type": 2 00:19:56.585 } 00:19:56.585 ], 00:19:56.585 "driver_specific": {} 00:19:56.585 }' 00:19:56.585 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.843 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.843 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:56.843 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.843 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:56.843 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:56.843 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.843 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.101 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:57.101 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.101 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.101 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:57.101 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:57.101 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:19:57.101 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:57.360 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:57.361 "name": "BaseBdev3", 00:19:57.361 "aliases": [ 00:19:57.361 "7a754565-477b-4b6d-adfa-2fca90c414d2" 00:19:57.361 ], 00:19:57.361 "product_name": "Malloc disk", 00:19:57.361 "block_size": 512, 00:19:57.361 "num_blocks": 65536, 00:19:57.361 "uuid": "7a754565-477b-4b6d-adfa-2fca90c414d2", 00:19:57.361 "assigned_rate_limits": { 00:19:57.361 "rw_ios_per_sec": 0, 00:19:57.361 "rw_mbytes_per_sec": 0, 00:19:57.361 "r_mbytes_per_sec": 0, 00:19:57.361 "w_mbytes_per_sec": 0 00:19:57.361 }, 00:19:57.361 "claimed": true, 00:19:57.361 "claim_type": "exclusive_write", 00:19:57.361 "zoned": false, 00:19:57.361 "supported_io_types": { 00:19:57.361 "read": true, 00:19:57.361 "write": true, 00:19:57.361 "unmap": true, 00:19:57.361 "flush": true, 00:19:57.361 "reset": true, 00:19:57.361 "nvme_admin": false, 00:19:57.361 "nvme_io": false, 00:19:57.361 "nvme_io_md": false, 00:19:57.361 "write_zeroes": true, 00:19:57.361 "zcopy": true, 00:19:57.361 "get_zone_info": false, 00:19:57.361 "zone_management": false, 00:19:57.361 "zone_append": false, 00:19:57.361 "compare": false, 00:19:57.361 "compare_and_write": false, 00:19:57.361 "abort": true, 00:19:57.361 "seek_hole": false, 00:19:57.361 "seek_data": false, 00:19:57.361 "copy": true, 00:19:57.361 "nvme_iov_md": false 00:19:57.361 }, 00:19:57.361 "memory_domains": [ 00:19:57.361 { 00:19:57.361 "dma_device_id": "system", 00:19:57.361 "dma_device_type": 1 00:19:57.361 }, 00:19:57.361 { 00:19:57.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.361 "dma_device_type": 2 00:19:57.361 } 00:19:57.361 ], 00:19:57.361 "driver_specific": {} 00:19:57.361 }' 00:19:57.361 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.361 18:23:40 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.361 18:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:57.361 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.361 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.361 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:57.361 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.620 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.620 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:57.620 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.620 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.620 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:57.620 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:57.620 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:57.620 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:57.880 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:57.880 "name": "BaseBdev4", 00:19:57.880 "aliases": [ 00:19:57.880 "6867ae8f-bf82-47af-a9af-b2956ed7163c" 00:19:57.880 ], 00:19:57.880 "product_name": "Malloc disk", 00:19:57.880 "block_size": 512, 00:19:57.880 "num_blocks": 65536, 00:19:57.880 "uuid": "6867ae8f-bf82-47af-a9af-b2956ed7163c", 00:19:57.880 "assigned_rate_limits": { 00:19:57.880 
"rw_ios_per_sec": 0, 00:19:57.880 "rw_mbytes_per_sec": 0, 00:19:57.880 "r_mbytes_per_sec": 0, 00:19:57.880 "w_mbytes_per_sec": 0 00:19:57.880 }, 00:19:57.880 "claimed": true, 00:19:57.880 "claim_type": "exclusive_write", 00:19:57.880 "zoned": false, 00:19:57.880 "supported_io_types": { 00:19:57.880 "read": true, 00:19:57.880 "write": true, 00:19:57.880 "unmap": true, 00:19:57.880 "flush": true, 00:19:57.880 "reset": true, 00:19:57.880 "nvme_admin": false, 00:19:57.880 "nvme_io": false, 00:19:57.880 "nvme_io_md": false, 00:19:57.880 "write_zeroes": true, 00:19:57.880 "zcopy": true, 00:19:57.880 "get_zone_info": false, 00:19:57.880 "zone_management": false, 00:19:57.880 "zone_append": false, 00:19:57.880 "compare": false, 00:19:57.880 "compare_and_write": false, 00:19:57.880 "abort": true, 00:19:57.880 "seek_hole": false, 00:19:57.880 "seek_data": false, 00:19:57.880 "copy": true, 00:19:57.880 "nvme_iov_md": false 00:19:57.880 }, 00:19:57.880 "memory_domains": [ 00:19:57.880 { 00:19:57.880 "dma_device_id": "system", 00:19:57.880 "dma_device_type": 1 00:19:57.880 }, 00:19:57.880 { 00:19:57.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.881 "dma_device_type": 2 00:19:57.881 } 00:19:57.881 ], 00:19:57.881 "driver_specific": {} 00:19:57.881 }' 00:19:57.881 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.881 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.881 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:57.881 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:58.140 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:58.140 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:58.140 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:19:58.140 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:58.140 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:58.140 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:58.140 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:58.140 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:58.140 18:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:58.399 [2024-07-12 18:23:42.077098] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:58.399 [2024-07-12 18:23:42.077124] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:58.399 [2024-07-12 18:23:42.077171] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.399 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:58.658 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.658 "name": "Existed_Raid", 00:19:58.658 "uuid": "1872619a-f60b-423b-8a5f-bb52ecabef33", 00:19:58.658 "strip_size_kb": 64, 00:19:58.658 "state": "offline", 00:19:58.658 "raid_level": "concat", 00:19:58.658 "superblock": true, 00:19:58.658 "num_base_bdevs": 4, 00:19:58.658 "num_base_bdevs_discovered": 3, 00:19:58.658 "num_base_bdevs_operational": 3, 00:19:58.658 "base_bdevs_list": [ 00:19:58.658 { 00:19:58.658 "name": null, 00:19:58.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.658 "is_configured": false, 00:19:58.658 "data_offset": 2048, 00:19:58.658 "data_size": 63488 00:19:58.658 }, 00:19:58.658 { 00:19:58.658 "name": "BaseBdev2", 00:19:58.658 "uuid": 
"3e00b82c-64e6-4bf4-8cc2-a3f860714dd8", 00:19:58.658 "is_configured": true, 00:19:58.658 "data_offset": 2048, 00:19:58.658 "data_size": 63488 00:19:58.658 }, 00:19:58.658 { 00:19:58.658 "name": "BaseBdev3", 00:19:58.658 "uuid": "7a754565-477b-4b6d-adfa-2fca90c414d2", 00:19:58.658 "is_configured": true, 00:19:58.658 "data_offset": 2048, 00:19:58.658 "data_size": 63488 00:19:58.658 }, 00:19:58.658 { 00:19:58.658 "name": "BaseBdev4", 00:19:58.658 "uuid": "6867ae8f-bf82-47af-a9af-b2956ed7163c", 00:19:58.658 "is_configured": true, 00:19:58.658 "data_offset": 2048, 00:19:58.658 "data_size": 63488 00:19:58.658 } 00:19:58.658 ] 00:19:58.658 }' 00:19:58.658 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.658 18:23:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:59.226 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:59.226 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:59.226 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.226 18:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:59.485 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:59.485 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:59.485 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:59.743 [2024-07-12 18:23:43.401667] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:59.743 18:23:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:59.743 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:59.743 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.743 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:00.001 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:00.001 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:00.001 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:00.258 [2024-07-12 18:23:43.903411] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:00.258 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:00.258 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:00.258 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.258 18:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:00.516 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:00.516 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:00.516 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:00.775 [2024-07-12 18:23:44.413073] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:00.775 [2024-07-12 18:23:44.413110] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1402350 name Existed_Raid, state offline 00:20:00.775 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:00.775 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:00.775 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.775 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:01.034 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:01.034 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:01.034 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:01.034 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:01.034 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:01.034 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:01.292 BaseBdev2 00:20:01.292 18:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:01.292 18:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:01.292 18:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:01.292 18:23:44 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:01.292 18:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:01.292 18:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:01.292 18:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:01.550 18:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:01.809 [ 00:20:01.809 { 00:20:01.809 "name": "BaseBdev2", 00:20:01.809 "aliases": [ 00:20:01.809 "a11740ce-ad97-4c83-933e-41298952f48f" 00:20:01.809 ], 00:20:01.809 "product_name": "Malloc disk", 00:20:01.809 "block_size": 512, 00:20:01.809 "num_blocks": 65536, 00:20:01.809 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:01.809 "assigned_rate_limits": { 00:20:01.809 "rw_ios_per_sec": 0, 00:20:01.809 "rw_mbytes_per_sec": 0, 00:20:01.809 "r_mbytes_per_sec": 0, 00:20:01.809 "w_mbytes_per_sec": 0 00:20:01.809 }, 00:20:01.809 "claimed": false, 00:20:01.809 "zoned": false, 00:20:01.809 "supported_io_types": { 00:20:01.809 "read": true, 00:20:01.809 "write": true, 00:20:01.809 "unmap": true, 00:20:01.809 "flush": true, 00:20:01.809 "reset": true, 00:20:01.809 "nvme_admin": false, 00:20:01.809 "nvme_io": false, 00:20:01.809 "nvme_io_md": false, 00:20:01.809 "write_zeroes": true, 00:20:01.809 "zcopy": true, 00:20:01.809 "get_zone_info": false, 00:20:01.809 "zone_management": false, 00:20:01.809 "zone_append": false, 00:20:01.809 "compare": false, 00:20:01.809 "compare_and_write": false, 00:20:01.809 "abort": true, 00:20:01.809 "seek_hole": false, 00:20:01.809 "seek_data": false, 00:20:01.809 "copy": true, 00:20:01.809 "nvme_iov_md": 
false 00:20:01.809 }, 00:20:01.809 "memory_domains": [ 00:20:01.809 { 00:20:01.809 "dma_device_id": "system", 00:20:01.809 "dma_device_type": 1 00:20:01.809 }, 00:20:01.809 { 00:20:01.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.809 "dma_device_type": 2 00:20:01.809 } 00:20:01.809 ], 00:20:01.809 "driver_specific": {} 00:20:01.809 } 00:20:01.809 ] 00:20:01.809 18:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:01.809 18:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:01.809 18:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:01.809 18:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:02.068 BaseBdev3 00:20:02.068 18:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:02.068 18:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:02.068 18:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:02.068 18:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:02.068 18:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:02.068 18:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:02.068 18:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:02.326 18:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:02.584 [ 00:20:02.584 { 00:20:02.584 "name": "BaseBdev3", 00:20:02.584 "aliases": [ 00:20:02.584 "f66bf931-dc95-48e7-b1ec-3889e0d491aa" 00:20:02.584 ], 00:20:02.584 "product_name": "Malloc disk", 00:20:02.584 "block_size": 512, 00:20:02.584 "num_blocks": 65536, 00:20:02.584 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:02.584 "assigned_rate_limits": { 00:20:02.584 "rw_ios_per_sec": 0, 00:20:02.584 "rw_mbytes_per_sec": 0, 00:20:02.584 "r_mbytes_per_sec": 0, 00:20:02.584 "w_mbytes_per_sec": 0 00:20:02.585 }, 00:20:02.585 "claimed": false, 00:20:02.585 "zoned": false, 00:20:02.585 "supported_io_types": { 00:20:02.585 "read": true, 00:20:02.585 "write": true, 00:20:02.585 "unmap": true, 00:20:02.585 "flush": true, 00:20:02.585 "reset": true, 00:20:02.585 "nvme_admin": false, 00:20:02.585 "nvme_io": false, 00:20:02.585 "nvme_io_md": false, 00:20:02.585 "write_zeroes": true, 00:20:02.585 "zcopy": true, 00:20:02.585 "get_zone_info": false, 00:20:02.585 "zone_management": false, 00:20:02.585 "zone_append": false, 00:20:02.585 "compare": false, 00:20:02.585 "compare_and_write": false, 00:20:02.585 "abort": true, 00:20:02.585 "seek_hole": false, 00:20:02.585 "seek_data": false, 00:20:02.585 "copy": true, 00:20:02.585 "nvme_iov_md": false 00:20:02.585 }, 00:20:02.585 "memory_domains": [ 00:20:02.585 { 00:20:02.585 "dma_device_id": "system", 00:20:02.585 "dma_device_type": 1 00:20:02.585 }, 00:20:02.585 { 00:20:02.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:02.585 "dma_device_type": 2 00:20:02.585 } 00:20:02.585 ], 00:20:02.585 "driver_specific": {} 00:20:02.585 } 00:20:02.585 ] 00:20:02.585 18:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:02.585 18:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:02.585 18:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:20:02.585 18:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:02.844 BaseBdev4 00:20:02.844 18:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:02.844 18:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:02.844 18:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:02.844 18:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:02.844 18:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:02.844 18:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:02.844 18:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:03.102 18:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:03.361 [ 00:20:03.361 { 00:20:03.361 "name": "BaseBdev4", 00:20:03.361 "aliases": [ 00:20:03.361 "5c27ef8b-6465-4136-9f50-7ca42eaa5299" 00:20:03.361 ], 00:20:03.361 "product_name": "Malloc disk", 00:20:03.361 "block_size": 512, 00:20:03.361 "num_blocks": 65536, 00:20:03.361 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:03.361 "assigned_rate_limits": { 00:20:03.361 "rw_ios_per_sec": 0, 00:20:03.361 "rw_mbytes_per_sec": 0, 00:20:03.361 "r_mbytes_per_sec": 0, 00:20:03.361 "w_mbytes_per_sec": 0 00:20:03.361 }, 00:20:03.361 "claimed": false, 00:20:03.361 "zoned": false, 00:20:03.361 "supported_io_types": { 00:20:03.361 
"read": true, 00:20:03.361 "write": true, 00:20:03.361 "unmap": true, 00:20:03.361 "flush": true, 00:20:03.361 "reset": true, 00:20:03.361 "nvme_admin": false, 00:20:03.361 "nvme_io": false, 00:20:03.361 "nvme_io_md": false, 00:20:03.361 "write_zeroes": true, 00:20:03.361 "zcopy": true, 00:20:03.361 "get_zone_info": false, 00:20:03.361 "zone_management": false, 00:20:03.361 "zone_append": false, 00:20:03.361 "compare": false, 00:20:03.361 "compare_and_write": false, 00:20:03.361 "abort": true, 00:20:03.361 "seek_hole": false, 00:20:03.361 "seek_data": false, 00:20:03.361 "copy": true, 00:20:03.361 "nvme_iov_md": false 00:20:03.361 }, 00:20:03.361 "memory_domains": [ 00:20:03.361 { 00:20:03.361 "dma_device_id": "system", 00:20:03.361 "dma_device_type": 1 00:20:03.361 }, 00:20:03.361 { 00:20:03.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.361 "dma_device_type": 2 00:20:03.361 } 00:20:03.361 ], 00:20:03.361 "driver_specific": {} 00:20:03.361 } 00:20:03.361 ] 00:20:03.361 18:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:03.361 18:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:03.361 18:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:03.361 18:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:03.619 [2024-07-12 18:23:47.093221] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:03.619 [2024-07-12 18:23:47.093260] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:03.619 [2024-07-12 18:23:47.093278] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:03.619 [2024-07-12 
18:23:47.094657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:03.619 [2024-07-12 18:23:47.094698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:03.619 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:03.619 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.619 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:03.619 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:03.619 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:03.619 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.619 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.619 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.620 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.620 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.620 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.620 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:03.887 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.888 "name": "Existed_Raid", 00:20:03.888 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:03.888 "strip_size_kb": 64, 
00:20:03.888 "state": "configuring", 00:20:03.888 "raid_level": "concat", 00:20:03.888 "superblock": true, 00:20:03.888 "num_base_bdevs": 4, 00:20:03.888 "num_base_bdevs_discovered": 3, 00:20:03.888 "num_base_bdevs_operational": 4, 00:20:03.888 "base_bdevs_list": [ 00:20:03.888 { 00:20:03.888 "name": "BaseBdev1", 00:20:03.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.888 "is_configured": false, 00:20:03.888 "data_offset": 0, 00:20:03.888 "data_size": 0 00:20:03.888 }, 00:20:03.888 { 00:20:03.888 "name": "BaseBdev2", 00:20:03.888 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:03.888 "is_configured": true, 00:20:03.888 "data_offset": 2048, 00:20:03.888 "data_size": 63488 00:20:03.888 }, 00:20:03.888 { 00:20:03.888 "name": "BaseBdev3", 00:20:03.888 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:03.888 "is_configured": true, 00:20:03.888 "data_offset": 2048, 00:20:03.888 "data_size": 63488 00:20:03.888 }, 00:20:03.888 { 00:20:03.888 "name": "BaseBdev4", 00:20:03.888 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:03.888 "is_configured": true, 00:20:03.888 "data_offset": 2048, 00:20:03.888 "data_size": 63488 00:20:03.888 } 00:20:03.888 ] 00:20:03.888 }' 00:20:03.888 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.888 18:23:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:04.512 18:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:04.512 [2024-07-12 18:23:48.180070] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.512 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:04.770 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.771 "name": "Existed_Raid", 00:20:04.771 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:04.771 "strip_size_kb": 64, 00:20:04.771 "state": "configuring", 00:20:04.771 "raid_level": "concat", 00:20:04.771 "superblock": true, 00:20:04.771 "num_base_bdevs": 4, 00:20:04.771 "num_base_bdevs_discovered": 2, 00:20:04.771 "num_base_bdevs_operational": 4, 00:20:04.771 "base_bdevs_list": [ 00:20:04.771 { 00:20:04.771 "name": "BaseBdev1", 00:20:04.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.771 "is_configured": false, 00:20:04.771 "data_offset": 0, 00:20:04.771 "data_size": 0 
00:20:04.771 }, 00:20:04.771 { 00:20:04.771 "name": null, 00:20:04.771 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:04.771 "is_configured": false, 00:20:04.771 "data_offset": 2048, 00:20:04.771 "data_size": 63488 00:20:04.771 }, 00:20:04.771 { 00:20:04.771 "name": "BaseBdev3", 00:20:04.771 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:04.771 "is_configured": true, 00:20:04.771 "data_offset": 2048, 00:20:04.771 "data_size": 63488 00:20:04.771 }, 00:20:04.771 { 00:20:04.771 "name": "BaseBdev4", 00:20:04.771 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:04.771 "is_configured": true, 00:20:04.771 "data_offset": 2048, 00:20:04.771 "data_size": 63488 00:20:04.771 } 00:20:04.771 ] 00:20:04.771 }' 00:20:04.771 18:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.771 18:23:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:05.337 18:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.337 18:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:05.595 18:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:05.595 18:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:05.854 [2024-07-12 18:23:49.536170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:05.854 BaseBdev1 00:20:05.854 18:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:05.854 18:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 
00:20:05.854 18:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:05.854 18:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:05.854 18:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:05.854 18:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:05.854 18:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:06.112 18:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:06.371 [ 00:20:06.371 { 00:20:06.371 "name": "BaseBdev1", 00:20:06.371 "aliases": [ 00:20:06.371 "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2" 00:20:06.371 ], 00:20:06.371 "product_name": "Malloc disk", 00:20:06.371 "block_size": 512, 00:20:06.371 "num_blocks": 65536, 00:20:06.371 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:06.371 "assigned_rate_limits": { 00:20:06.371 "rw_ios_per_sec": 0, 00:20:06.371 "rw_mbytes_per_sec": 0, 00:20:06.371 "r_mbytes_per_sec": 0, 00:20:06.371 "w_mbytes_per_sec": 0 00:20:06.371 }, 00:20:06.371 "claimed": true, 00:20:06.371 "claim_type": "exclusive_write", 00:20:06.371 "zoned": false, 00:20:06.371 "supported_io_types": { 00:20:06.371 "read": true, 00:20:06.371 "write": true, 00:20:06.371 "unmap": true, 00:20:06.371 "flush": true, 00:20:06.371 "reset": true, 00:20:06.371 "nvme_admin": false, 00:20:06.371 "nvme_io": false, 00:20:06.371 "nvme_io_md": false, 00:20:06.371 "write_zeroes": true, 00:20:06.371 "zcopy": true, 00:20:06.371 "get_zone_info": false, 00:20:06.371 "zone_management": false, 00:20:06.371 "zone_append": false, 00:20:06.371 "compare": false, 
00:20:06.371 "compare_and_write": false, 00:20:06.371 "abort": true, 00:20:06.371 "seek_hole": false, 00:20:06.371 "seek_data": false, 00:20:06.371 "copy": true, 00:20:06.371 "nvme_iov_md": false 00:20:06.371 }, 00:20:06.371 "memory_domains": [ 00:20:06.371 { 00:20:06.371 "dma_device_id": "system", 00:20:06.371 "dma_device_type": 1 00:20:06.371 }, 00:20:06.371 { 00:20:06.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:06.371 "dma_device_type": 2 00:20:06.371 } 00:20:06.371 ], 00:20:06.371 "driver_specific": {} 00:20:06.371 } 00:20:06.371 ] 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.371 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:06.630 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.630 "name": "Existed_Raid", 00:20:06.630 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:06.630 "strip_size_kb": 64, 00:20:06.630 "state": "configuring", 00:20:06.630 "raid_level": "concat", 00:20:06.630 "superblock": true, 00:20:06.630 "num_base_bdevs": 4, 00:20:06.630 "num_base_bdevs_discovered": 3, 00:20:06.630 "num_base_bdevs_operational": 4, 00:20:06.630 "base_bdevs_list": [ 00:20:06.630 { 00:20:06.630 "name": "BaseBdev1", 00:20:06.630 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:06.630 "is_configured": true, 00:20:06.630 "data_offset": 2048, 00:20:06.630 "data_size": 63488 00:20:06.630 }, 00:20:06.630 { 00:20:06.630 "name": null, 00:20:06.630 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:06.630 "is_configured": false, 00:20:06.631 "data_offset": 2048, 00:20:06.631 "data_size": 63488 00:20:06.631 }, 00:20:06.631 { 00:20:06.631 "name": "BaseBdev3", 00:20:06.631 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:06.631 "is_configured": true, 00:20:06.631 "data_offset": 2048, 00:20:06.631 "data_size": 63488 00:20:06.631 }, 00:20:06.631 { 00:20:06.631 "name": "BaseBdev4", 00:20:06.631 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:06.631 "is_configured": true, 00:20:06.631 "data_offset": 2048, 00:20:06.631 "data_size": 63488 00:20:06.631 } 00:20:06.631 ] 00:20:06.631 }' 00:20:06.631 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.631 18:23:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:07.198 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.198 18:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:07.457 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:07.457 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:07.715 [2024-07-12 18:23:51.365063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:07.715 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.973 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.973 "name": "Existed_Raid", 00:20:07.973 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:07.973 "strip_size_kb": 64, 00:20:07.973 "state": "configuring", 00:20:07.973 "raid_level": "concat", 00:20:07.973 "superblock": true, 00:20:07.973 "num_base_bdevs": 4, 00:20:07.973 "num_base_bdevs_discovered": 2, 00:20:07.973 "num_base_bdevs_operational": 4, 00:20:07.973 "base_bdevs_list": [ 00:20:07.973 { 00:20:07.973 "name": "BaseBdev1", 00:20:07.973 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:07.973 "is_configured": true, 00:20:07.973 "data_offset": 2048, 00:20:07.973 "data_size": 63488 00:20:07.973 }, 00:20:07.973 { 00:20:07.973 "name": null, 00:20:07.973 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:07.973 "is_configured": false, 00:20:07.973 "data_offset": 2048, 00:20:07.973 "data_size": 63488 00:20:07.973 }, 00:20:07.973 { 00:20:07.973 "name": null, 00:20:07.973 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:07.973 "is_configured": false, 00:20:07.973 "data_offset": 2048, 00:20:07.973 "data_size": 63488 00:20:07.973 }, 00:20:07.973 { 00:20:07.973 "name": "BaseBdev4", 00:20:07.973 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:07.973 "is_configured": true, 00:20:07.973 "data_offset": 2048, 00:20:07.973 "data_size": 63488 00:20:07.973 } 00:20:07.973 ] 00:20:07.973 }' 00:20:07.973 18:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.973 18:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:08.539 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:08.540 18:23:52 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.798 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:08.798 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:09.057 [2024-07-12 18:23:52.692702] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:09.057 18:23:52 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.325 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:09.325 "name": "Existed_Raid", 00:20:09.325 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:09.325 "strip_size_kb": 64, 00:20:09.325 "state": "configuring", 00:20:09.325 "raid_level": "concat", 00:20:09.325 "superblock": true, 00:20:09.325 "num_base_bdevs": 4, 00:20:09.325 "num_base_bdevs_discovered": 3, 00:20:09.325 "num_base_bdevs_operational": 4, 00:20:09.325 "base_bdevs_list": [ 00:20:09.325 { 00:20:09.325 "name": "BaseBdev1", 00:20:09.325 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:09.325 "is_configured": true, 00:20:09.325 "data_offset": 2048, 00:20:09.325 "data_size": 63488 00:20:09.326 }, 00:20:09.326 { 00:20:09.326 "name": null, 00:20:09.326 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:09.326 "is_configured": false, 00:20:09.326 "data_offset": 2048, 00:20:09.326 "data_size": 63488 00:20:09.326 }, 00:20:09.326 { 00:20:09.326 "name": "BaseBdev3", 00:20:09.326 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:09.326 "is_configured": true, 00:20:09.326 "data_offset": 2048, 00:20:09.326 "data_size": 63488 00:20:09.326 }, 00:20:09.326 { 00:20:09.326 "name": "BaseBdev4", 00:20:09.326 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:09.326 "is_configured": true, 00:20:09.326 "data_offset": 2048, 00:20:09.326 "data_size": 63488 00:20:09.326 } 00:20:09.326 ] 00:20:09.326 }' 00:20:09.326 18:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:09.326 18:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:09.899 18:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:20:09.899 18:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:10.157 18:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:10.157 18:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:10.417 [2024-07-12 18:23:53.980145] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:10.417 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.675 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.675 "name": "Existed_Raid", 00:20:10.675 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:10.675 "strip_size_kb": 64, 00:20:10.675 "state": "configuring", 00:20:10.675 "raid_level": "concat", 00:20:10.675 "superblock": true, 00:20:10.675 "num_base_bdevs": 4, 00:20:10.675 "num_base_bdevs_discovered": 2, 00:20:10.675 "num_base_bdevs_operational": 4, 00:20:10.675 "base_bdevs_list": [ 00:20:10.675 { 00:20:10.675 "name": null, 00:20:10.675 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:10.675 "is_configured": false, 00:20:10.675 "data_offset": 2048, 00:20:10.675 "data_size": 63488 00:20:10.675 }, 00:20:10.675 { 00:20:10.675 "name": null, 00:20:10.675 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:10.675 "is_configured": false, 00:20:10.675 "data_offset": 2048, 00:20:10.675 "data_size": 63488 00:20:10.675 }, 00:20:10.675 { 00:20:10.675 "name": "BaseBdev3", 00:20:10.675 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:10.675 "is_configured": true, 00:20:10.675 "data_offset": 2048, 00:20:10.675 "data_size": 63488 00:20:10.675 }, 00:20:10.675 { 00:20:10.675 "name": "BaseBdev4", 00:20:10.675 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:10.675 "is_configured": true, 00:20:10.675 "data_offset": 2048, 00:20:10.675 "data_size": 63488 00:20:10.675 } 00:20:10.675 ] 00:20:10.675 }' 00:20:10.675 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.675 18:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:11.241 18:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.241 18:23:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:11.500 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:11.500 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:11.758 [2024-07-12 18:23:55.287990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:11.758 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.017 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.017 "name": "Existed_Raid", 00:20:12.017 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:12.017 "strip_size_kb": 64, 00:20:12.017 "state": "configuring", 00:20:12.017 "raid_level": "concat", 00:20:12.017 "superblock": true, 00:20:12.017 "num_base_bdevs": 4, 00:20:12.017 "num_base_bdevs_discovered": 3, 00:20:12.017 "num_base_bdevs_operational": 4, 00:20:12.017 "base_bdevs_list": [ 00:20:12.017 { 00:20:12.017 "name": null, 00:20:12.017 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:12.017 "is_configured": false, 00:20:12.017 "data_offset": 2048, 00:20:12.017 "data_size": 63488 00:20:12.017 }, 00:20:12.017 { 00:20:12.017 "name": "BaseBdev2", 00:20:12.017 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:12.017 "is_configured": true, 00:20:12.017 "data_offset": 2048, 00:20:12.017 "data_size": 63488 00:20:12.017 }, 00:20:12.017 { 00:20:12.017 "name": "BaseBdev3", 00:20:12.017 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:12.017 "is_configured": true, 00:20:12.017 "data_offset": 2048, 00:20:12.017 "data_size": 63488 00:20:12.017 }, 00:20:12.017 { 00:20:12.017 "name": "BaseBdev4", 00:20:12.017 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:12.017 "is_configured": true, 00:20:12.017 "data_offset": 2048, 00:20:12.017 "data_size": 63488 00:20:12.017 } 00:20:12.017 ] 00:20:12.018 }' 00:20:12.018 18:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.018 18:23:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:12.586 18:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.586 18:23:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:12.844 18:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:12.844 18:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.844 18:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:13.103 18:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6c4aeff5-1bfc-4243-a6c9-2506c4063fc2 00:20:13.362 [2024-07-12 18:23:56.883683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:13.362 [2024-07-12 18:23:56.883835] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1404850 00:20:13.362 [2024-07-12 18:23:56.883848] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:13.362 [2024-07-12 18:23:56.884032] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13fad80 00:20:13.362 [2024-07-12 18:23:56.884146] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1404850 00:20:13.362 [2024-07-12 18:23:56.884156] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1404850 00:20:13.362 [2024-07-12 18:23:56.884244] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:13.362 NewBaseBdev 00:20:13.362 18:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:13.362 18:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:13.362 18:23:56 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:13.362 18:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:13.362 18:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:13.362 18:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:13.362 18:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:13.621 18:23:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:13.880 [ 00:20:13.880 { 00:20:13.880 "name": "NewBaseBdev", 00:20:13.880 "aliases": [ 00:20:13.880 "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2" 00:20:13.880 ], 00:20:13.880 "product_name": "Malloc disk", 00:20:13.880 "block_size": 512, 00:20:13.880 "num_blocks": 65536, 00:20:13.880 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:13.880 "assigned_rate_limits": { 00:20:13.880 "rw_ios_per_sec": 0, 00:20:13.880 "rw_mbytes_per_sec": 0, 00:20:13.880 "r_mbytes_per_sec": 0, 00:20:13.880 "w_mbytes_per_sec": 0 00:20:13.880 }, 00:20:13.880 "claimed": true, 00:20:13.880 "claim_type": "exclusive_write", 00:20:13.880 "zoned": false, 00:20:13.880 "supported_io_types": { 00:20:13.880 "read": true, 00:20:13.880 "write": true, 00:20:13.880 "unmap": true, 00:20:13.880 "flush": true, 00:20:13.880 "reset": true, 00:20:13.880 "nvme_admin": false, 00:20:13.880 "nvme_io": false, 00:20:13.880 "nvme_io_md": false, 00:20:13.880 "write_zeroes": true, 00:20:13.880 "zcopy": true, 00:20:13.880 "get_zone_info": false, 00:20:13.880 "zone_management": false, 00:20:13.880 "zone_append": false, 00:20:13.880 "compare": false, 00:20:13.880 
"compare_and_write": false, 00:20:13.880 "abort": true, 00:20:13.880 "seek_hole": false, 00:20:13.880 "seek_data": false, 00:20:13.880 "copy": true, 00:20:13.880 "nvme_iov_md": false 00:20:13.880 }, 00:20:13.880 "memory_domains": [ 00:20:13.880 { 00:20:13.880 "dma_device_id": "system", 00:20:13.880 "dma_device_type": 1 00:20:13.880 }, 00:20:13.880 { 00:20:13.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.880 "dma_device_type": 2 00:20:13.880 } 00:20:13.880 ], 00:20:13.880 "driver_specific": {} 00:20:13.880 } 00:20:13.880 ] 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.880 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:14.140 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:14.140 "name": "Existed_Raid", 00:20:14.140 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:14.140 "strip_size_kb": 64, 00:20:14.140 "state": "online", 00:20:14.140 "raid_level": "concat", 00:20:14.140 "superblock": true, 00:20:14.140 "num_base_bdevs": 4, 00:20:14.140 "num_base_bdevs_discovered": 4, 00:20:14.140 "num_base_bdevs_operational": 4, 00:20:14.140 "base_bdevs_list": [ 00:20:14.140 { 00:20:14.140 "name": "NewBaseBdev", 00:20:14.140 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:14.140 "is_configured": true, 00:20:14.140 "data_offset": 2048, 00:20:14.140 "data_size": 63488 00:20:14.140 }, 00:20:14.140 { 00:20:14.140 "name": "BaseBdev2", 00:20:14.140 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:14.140 "is_configured": true, 00:20:14.140 "data_offset": 2048, 00:20:14.140 "data_size": 63488 00:20:14.140 }, 00:20:14.140 { 00:20:14.140 "name": "BaseBdev3", 00:20:14.140 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:14.140 "is_configured": true, 00:20:14.140 "data_offset": 2048, 00:20:14.140 "data_size": 63488 00:20:14.140 }, 00:20:14.140 { 00:20:14.140 "name": "BaseBdev4", 00:20:14.140 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:14.140 "is_configured": true, 00:20:14.140 "data_offset": 2048, 00:20:14.140 "data_size": 63488 00:20:14.140 } 00:20:14.140 ] 00:20:14.140 }' 00:20:14.140 18:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:14.140 18:23:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:14.707 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:14.707 18:23:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:14.707 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:14.707 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:14.707 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:14.707 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:14.707 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:14.707 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:14.966 [2024-07-12 18:23:58.444152] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:14.966 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:14.966 "name": "Existed_Raid", 00:20:14.966 "aliases": [ 00:20:14.966 "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2" 00:20:14.966 ], 00:20:14.966 "product_name": "Raid Volume", 00:20:14.966 "block_size": 512, 00:20:14.966 "num_blocks": 253952, 00:20:14.966 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:14.966 "assigned_rate_limits": { 00:20:14.966 "rw_ios_per_sec": 0, 00:20:14.966 "rw_mbytes_per_sec": 0, 00:20:14.966 "r_mbytes_per_sec": 0, 00:20:14.966 "w_mbytes_per_sec": 0 00:20:14.966 }, 00:20:14.966 "claimed": false, 00:20:14.966 "zoned": false, 00:20:14.966 "supported_io_types": { 00:20:14.966 "read": true, 00:20:14.966 "write": true, 00:20:14.966 "unmap": true, 00:20:14.966 "flush": true, 00:20:14.966 "reset": true, 00:20:14.966 "nvme_admin": false, 00:20:14.966 "nvme_io": false, 00:20:14.966 "nvme_io_md": false, 00:20:14.966 "write_zeroes": true, 00:20:14.966 "zcopy": false, 00:20:14.966 
"get_zone_info": false, 00:20:14.966 "zone_management": false, 00:20:14.966 "zone_append": false, 00:20:14.966 "compare": false, 00:20:14.966 "compare_and_write": false, 00:20:14.966 "abort": false, 00:20:14.966 "seek_hole": false, 00:20:14.966 "seek_data": false, 00:20:14.966 "copy": false, 00:20:14.966 "nvme_iov_md": false 00:20:14.966 }, 00:20:14.966 "memory_domains": [ 00:20:14.966 { 00:20:14.966 "dma_device_id": "system", 00:20:14.966 "dma_device_type": 1 00:20:14.966 }, 00:20:14.966 { 00:20:14.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.966 "dma_device_type": 2 00:20:14.966 }, 00:20:14.966 { 00:20:14.966 "dma_device_id": "system", 00:20:14.966 "dma_device_type": 1 00:20:14.966 }, 00:20:14.966 { 00:20:14.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.966 "dma_device_type": 2 00:20:14.966 }, 00:20:14.966 { 00:20:14.966 "dma_device_id": "system", 00:20:14.966 "dma_device_type": 1 00:20:14.966 }, 00:20:14.966 { 00:20:14.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.966 "dma_device_type": 2 00:20:14.966 }, 00:20:14.966 { 00:20:14.966 "dma_device_id": "system", 00:20:14.966 "dma_device_type": 1 00:20:14.966 }, 00:20:14.966 { 00:20:14.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.966 "dma_device_type": 2 00:20:14.966 } 00:20:14.966 ], 00:20:14.966 "driver_specific": { 00:20:14.966 "raid": { 00:20:14.966 "uuid": "aeb05eaf-5f26-4ec6-9b0b-449ccb4c10c2", 00:20:14.966 "strip_size_kb": 64, 00:20:14.966 "state": "online", 00:20:14.966 "raid_level": "concat", 00:20:14.966 "superblock": true, 00:20:14.966 "num_base_bdevs": 4, 00:20:14.966 "num_base_bdevs_discovered": 4, 00:20:14.966 "num_base_bdevs_operational": 4, 00:20:14.966 "base_bdevs_list": [ 00:20:14.966 { 00:20:14.966 "name": "NewBaseBdev", 00:20:14.966 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:14.966 "is_configured": true, 00:20:14.966 "data_offset": 2048, 00:20:14.966 "data_size": 63488 00:20:14.966 }, 00:20:14.966 { 00:20:14.966 "name": "BaseBdev2", 00:20:14.966 
"uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:14.966 "is_configured": true, 00:20:14.966 "data_offset": 2048, 00:20:14.966 "data_size": 63488 00:20:14.967 }, 00:20:14.967 { 00:20:14.967 "name": "BaseBdev3", 00:20:14.967 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:14.967 "is_configured": true, 00:20:14.967 "data_offset": 2048, 00:20:14.967 "data_size": 63488 00:20:14.967 }, 00:20:14.967 { 00:20:14.967 "name": "BaseBdev4", 00:20:14.967 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:14.967 "is_configured": true, 00:20:14.967 "data_offset": 2048, 00:20:14.967 "data_size": 63488 00:20:14.967 } 00:20:14.967 ] 00:20:14.967 } 00:20:14.967 } 00:20:14.967 }' 00:20:14.967 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:14.967 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:14.967 BaseBdev2 00:20:14.967 BaseBdev3 00:20:14.967 BaseBdev4' 00:20:14.967 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:14.967 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:14.967 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:15.225 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:15.225 "name": "NewBaseBdev", 00:20:15.225 "aliases": [ 00:20:15.225 "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2" 00:20:15.225 ], 00:20:15.225 "product_name": "Malloc disk", 00:20:15.225 "block_size": 512, 00:20:15.225 "num_blocks": 65536, 00:20:15.225 "uuid": "6c4aeff5-1bfc-4243-a6c9-2506c4063fc2", 00:20:15.225 "assigned_rate_limits": { 00:20:15.225 "rw_ios_per_sec": 0, 00:20:15.225 "rw_mbytes_per_sec": 0, 
00:20:15.225 "r_mbytes_per_sec": 0, 00:20:15.225 "w_mbytes_per_sec": 0 00:20:15.225 }, 00:20:15.225 "claimed": true, 00:20:15.225 "claim_type": "exclusive_write", 00:20:15.225 "zoned": false, 00:20:15.225 "supported_io_types": { 00:20:15.225 "read": true, 00:20:15.225 "write": true, 00:20:15.225 "unmap": true, 00:20:15.225 "flush": true, 00:20:15.225 "reset": true, 00:20:15.225 "nvme_admin": false, 00:20:15.225 "nvme_io": false, 00:20:15.225 "nvme_io_md": false, 00:20:15.225 "write_zeroes": true, 00:20:15.225 "zcopy": true, 00:20:15.225 "get_zone_info": false, 00:20:15.225 "zone_management": false, 00:20:15.225 "zone_append": false, 00:20:15.225 "compare": false, 00:20:15.225 "compare_and_write": false, 00:20:15.225 "abort": true, 00:20:15.225 "seek_hole": false, 00:20:15.225 "seek_data": false, 00:20:15.225 "copy": true, 00:20:15.225 "nvme_iov_md": false 00:20:15.225 }, 00:20:15.225 "memory_domains": [ 00:20:15.225 { 00:20:15.225 "dma_device_id": "system", 00:20:15.225 "dma_device_type": 1 00:20:15.225 }, 00:20:15.225 { 00:20:15.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.225 "dma_device_type": 2 00:20:15.225 } 00:20:15.225 ], 00:20:15.225 "driver_specific": {} 00:20:15.225 }' 00:20:15.225 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.225 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.225 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.225 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.225 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:15.225 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:15.225 18:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.484 18:23:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:15.484 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:15.484 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.484 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:15.484 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:15.484 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:15.484 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:15.484 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:15.742 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:15.742 "name": "BaseBdev2", 00:20:15.742 "aliases": [ 00:20:15.742 "a11740ce-ad97-4c83-933e-41298952f48f" 00:20:15.742 ], 00:20:15.742 "product_name": "Malloc disk", 00:20:15.742 "block_size": 512, 00:20:15.742 "num_blocks": 65536, 00:20:15.742 "uuid": "a11740ce-ad97-4c83-933e-41298952f48f", 00:20:15.742 "assigned_rate_limits": { 00:20:15.742 "rw_ios_per_sec": 0, 00:20:15.742 "rw_mbytes_per_sec": 0, 00:20:15.742 "r_mbytes_per_sec": 0, 00:20:15.742 "w_mbytes_per_sec": 0 00:20:15.742 }, 00:20:15.742 "claimed": true, 00:20:15.742 "claim_type": "exclusive_write", 00:20:15.742 "zoned": false, 00:20:15.742 "supported_io_types": { 00:20:15.742 "read": true, 00:20:15.742 "write": true, 00:20:15.742 "unmap": true, 00:20:15.742 "flush": true, 00:20:15.742 "reset": true, 00:20:15.742 "nvme_admin": false, 00:20:15.742 "nvme_io": false, 00:20:15.742 "nvme_io_md": false, 00:20:15.742 "write_zeroes": true, 00:20:15.742 "zcopy": true, 00:20:15.742 
"get_zone_info": false, 00:20:15.742 "zone_management": false, 00:20:15.742 "zone_append": false, 00:20:15.742 "compare": false, 00:20:15.742 "compare_and_write": false, 00:20:15.742 "abort": true, 00:20:15.742 "seek_hole": false, 00:20:15.742 "seek_data": false, 00:20:15.742 "copy": true, 00:20:15.742 "nvme_iov_md": false 00:20:15.742 }, 00:20:15.742 "memory_domains": [ 00:20:15.742 { 00:20:15.742 "dma_device_id": "system", 00:20:15.742 "dma_device_type": 1 00:20:15.742 }, 00:20:15.742 { 00:20:15.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.742 "dma_device_type": 2 00:20:15.742 } 00:20:15.742 ], 00:20:15.742 "driver_specific": {} 00:20:15.742 }' 00:20:15.742 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.742 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:15.743 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:15.743 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:16.001 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:16.260 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:16.260 "name": "BaseBdev3", 00:20:16.260 "aliases": [ 00:20:16.260 "f66bf931-dc95-48e7-b1ec-3889e0d491aa" 00:20:16.260 ], 00:20:16.260 "product_name": "Malloc disk", 00:20:16.260 "block_size": 512, 00:20:16.260 "num_blocks": 65536, 00:20:16.260 "uuid": "f66bf931-dc95-48e7-b1ec-3889e0d491aa", 00:20:16.260 "assigned_rate_limits": { 00:20:16.260 "rw_ios_per_sec": 0, 00:20:16.260 "rw_mbytes_per_sec": 0, 00:20:16.260 "r_mbytes_per_sec": 0, 00:20:16.260 "w_mbytes_per_sec": 0 00:20:16.260 }, 00:20:16.260 "claimed": true, 00:20:16.260 "claim_type": "exclusive_write", 00:20:16.260 "zoned": false, 00:20:16.260 "supported_io_types": { 00:20:16.260 "read": true, 00:20:16.260 "write": true, 00:20:16.260 "unmap": true, 00:20:16.260 "flush": true, 00:20:16.260 "reset": true, 00:20:16.260 "nvme_admin": false, 00:20:16.260 "nvme_io": false, 00:20:16.260 "nvme_io_md": false, 00:20:16.260 "write_zeroes": true, 00:20:16.260 "zcopy": true, 00:20:16.260 "get_zone_info": false, 00:20:16.260 "zone_management": false, 00:20:16.260 "zone_append": false, 00:20:16.260 "compare": false, 00:20:16.260 "compare_and_write": false, 00:20:16.260 "abort": true, 00:20:16.260 "seek_hole": false, 00:20:16.260 "seek_data": false, 00:20:16.260 "copy": true, 00:20:16.260 "nvme_iov_md": false 00:20:16.260 }, 00:20:16.260 "memory_domains": [ 00:20:16.260 { 00:20:16.260 "dma_device_id": "system", 00:20:16.260 "dma_device_type": 1 00:20:16.260 }, 00:20:16.260 { 00:20:16.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.260 
"dma_device_type": 2 00:20:16.260 } 00:20:16.260 ], 00:20:16.260 "driver_specific": {} 00:20:16.260 }' 00:20:16.260 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.260 18:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.518 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:16.518 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.518 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.518 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:16.518 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.518 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.518 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:16.518 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.518 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.777 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:16.777 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:16.777 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:16.777 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:17.035 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:17.035 "name": "BaseBdev4", 00:20:17.035 "aliases": [ 00:20:17.035 
"5c27ef8b-6465-4136-9f50-7ca42eaa5299" 00:20:17.035 ], 00:20:17.035 "product_name": "Malloc disk", 00:20:17.035 "block_size": 512, 00:20:17.035 "num_blocks": 65536, 00:20:17.035 "uuid": "5c27ef8b-6465-4136-9f50-7ca42eaa5299", 00:20:17.035 "assigned_rate_limits": { 00:20:17.035 "rw_ios_per_sec": 0, 00:20:17.035 "rw_mbytes_per_sec": 0, 00:20:17.035 "r_mbytes_per_sec": 0, 00:20:17.035 "w_mbytes_per_sec": 0 00:20:17.035 }, 00:20:17.035 "claimed": true, 00:20:17.035 "claim_type": "exclusive_write", 00:20:17.035 "zoned": false, 00:20:17.035 "supported_io_types": { 00:20:17.035 "read": true, 00:20:17.035 "write": true, 00:20:17.035 "unmap": true, 00:20:17.035 "flush": true, 00:20:17.035 "reset": true, 00:20:17.035 "nvme_admin": false, 00:20:17.035 "nvme_io": false, 00:20:17.035 "nvme_io_md": false, 00:20:17.035 "write_zeroes": true, 00:20:17.035 "zcopy": true, 00:20:17.035 "get_zone_info": false, 00:20:17.035 "zone_management": false, 00:20:17.035 "zone_append": false, 00:20:17.035 "compare": false, 00:20:17.035 "compare_and_write": false, 00:20:17.035 "abort": true, 00:20:17.035 "seek_hole": false, 00:20:17.035 "seek_data": false, 00:20:17.035 "copy": true, 00:20:17.035 "nvme_iov_md": false 00:20:17.035 }, 00:20:17.035 "memory_domains": [ 00:20:17.035 { 00:20:17.035 "dma_device_id": "system", 00:20:17.035 "dma_device_type": 1 00:20:17.035 }, 00:20:17.035 { 00:20:17.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.035 "dma_device_type": 2 00:20:17.035 } 00:20:17.035 ], 00:20:17.035 "driver_specific": {} 00:20:17.035 }' 00:20:17.035 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.035 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.035 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:17.035 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.035 18:24:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.035 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:17.035 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.035 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.294 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:17.294 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.294 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.294 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:17.294 18:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:17.552 [2024-07-12 18:24:01.102919] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:17.552 [2024-07-12 18:24:01.102954] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:17.552 [2024-07-12 18:24:01.103007] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:17.552 [2024-07-12 18:24:01.103066] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:17.552 [2024-07-12 18:24:01.103078] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1404850 name Existed_Raid, state offline 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2538645 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2538645 ']' 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 2538645 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2538645 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2538645' 00:20:17.552 killing process with pid 2538645 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2538645 00:20:17.552 [2024-07-12 18:24:01.176433] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:17.552 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2538645 00:20:17.552 [2024-07-12 18:24:01.214574] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:17.810 18:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:17.810 00:20:17.810 real 0m32.867s 00:20:17.810 user 1m0.322s 00:20:17.810 sys 0m5.828s 00:20:17.810 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:17.810 18:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:17.810 ************************************ 00:20:17.810 END TEST raid_state_function_test_sb 00:20:17.810 ************************************ 00:20:17.810 18:24:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:17.810 18:24:01 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test concat 4 00:20:17.810 18:24:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:17.810 18:24:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:17.810 18:24:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:17.810 ************************************ 00:20:17.810 START TEST raid_superblock_test 00:20:17.810 ************************************ 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:17.810 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2543641 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2543641 /var/tmp/spdk-raid.sock 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2543641 ']' 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:17.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:17.811 18:24:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:18.069 [2024-07-12 18:24:01.593515] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:20:18.069 [2024-07-12 18:24:01.593586] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2543641 ] 00:20:18.069 [2024-07-12 18:24:01.725158] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:18.328 [2024-07-12 18:24:01.832763] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:18.328 [2024-07-12 18:24:01.896777] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:18.328 [2024-07-12 18:24:01.896812] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:18.896 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:19.155 malloc1 00:20:19.155 18:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:19.412 [2024-07-12 18:24:03.008934] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:19.412 [2024-07-12 18:24:03.008984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:19.412 [2024-07-12 18:24:03.009003] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd9a570 00:20:19.412 [2024-07-12 18:24:03.009015] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:19.412 [2024-07-12 18:24:03.010566] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:19.412 [2024-07-12 18:24:03.010594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:19.412 pt1 00:20:19.412 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:19.412 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:19.412 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:19.412 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:19.412 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:19.413 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:19.413 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:19.413 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:19.413 18:24:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:19.670 malloc2 00:20:19.670 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:19.928 [2024-07-12 18:24:03.506922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:19.928 [2024-07-12 18:24:03.506972] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:19.928 [2024-07-12 18:24:03.506989] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd9b970 00:20:19.928 [2024-07-12 18:24:03.507001] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:19.928 [2024-07-12 18:24:03.508454] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:19.928 [2024-07-12 18:24:03.508481] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:19.928 pt2 00:20:19.928 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:19.928 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:19.928 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:19.928 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:19.928 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:19.928 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:19.928 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:19.928 18:24:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:19.928 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:20.187 malloc3 00:20:20.187 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:20.477 [2024-07-12 18:24:03.944803] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:20.477 [2024-07-12 18:24:03.944851] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:20.477 [2024-07-12 18:24:03.944868] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf32340 00:20:20.477 [2024-07-12 18:24:03.944881] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:20.477 [2024-07-12 18:24:03.946324] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:20.477 [2024-07-12 18:24:03.946351] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:20.477 pt3 00:20:20.477 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:20.477 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:20.477 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:20.477 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:20.477 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:20.477 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:20.477 
18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:20.477 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:20.477 18:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:20.477 malloc4 00:20:20.477 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:20.743 [2024-07-12 18:24:04.378553] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:20.743 [2024-07-12 18:24:04.378605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:20.743 [2024-07-12 18:24:04.378625] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf34c60 00:20:20.743 [2024-07-12 18:24:04.378644] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:20.743 [2024-07-12 18:24:04.380158] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:20.743 [2024-07-12 18:24:04.380186] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:20.743 pt4 00:20:20.743 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:20.743 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:20.743 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:21.002 [2024-07-12 18:24:04.563066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:20:21.002 [2024-07-12 18:24:04.564293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:21.002 [2024-07-12 18:24:04.564348] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:21.002 [2024-07-12 18:24:04.564391] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:21.002 [2024-07-12 18:24:04.564559] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd92530 00:20:21.002 [2024-07-12 18:24:04.564571] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:21.002 [2024-07-12 18:24:04.564766] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd90770 00:20:21.002 [2024-07-12 18:24:04.564910] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd92530 00:20:21.002 [2024-07-12 18:24:04.564921] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd92530 00:20:21.002 [2024-07-12 18:24:04.565025] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:21.002 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.261 18:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.261 "name": "raid_bdev1", 00:20:21.261 "uuid": "b24a3c07-eb8a-4975-8482-bd9347f124ec", 00:20:21.261 "strip_size_kb": 64, 00:20:21.261 "state": "online", 00:20:21.261 "raid_level": "concat", 00:20:21.261 "superblock": true, 00:20:21.261 "num_base_bdevs": 4, 00:20:21.261 "num_base_bdevs_discovered": 4, 00:20:21.261 "num_base_bdevs_operational": 4, 00:20:21.261 "base_bdevs_list": [ 00:20:21.261 { 00:20:21.261 "name": "pt1", 00:20:21.261 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:21.261 "is_configured": true, 00:20:21.261 "data_offset": 2048, 00:20:21.261 "data_size": 63488 00:20:21.261 }, 00:20:21.261 { 00:20:21.261 "name": "pt2", 00:20:21.261 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:21.261 "is_configured": true, 00:20:21.261 "data_offset": 2048, 00:20:21.261 "data_size": 63488 00:20:21.261 }, 00:20:21.261 { 00:20:21.261 "name": "pt3", 00:20:21.261 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:21.261 "is_configured": true, 00:20:21.261 "data_offset": 2048, 00:20:21.261 "data_size": 63488 00:20:21.261 }, 00:20:21.261 { 00:20:21.261 "name": "pt4", 00:20:21.261 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:21.261 "is_configured": true, 00:20:21.261 "data_offset": 2048, 00:20:21.261 "data_size": 63488 00:20:21.261 } 00:20:21.261 ] 00:20:21.261 }' 00:20:21.261 18:24:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.261 18:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:21.828 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:21.828 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:21.828 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:21.828 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:21.828 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:21.828 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:21.828 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:21.828 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:22.087 [2024-07-12 18:24:05.650247] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:22.087 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:22.087 "name": "raid_bdev1", 00:20:22.087 "aliases": [ 00:20:22.087 "b24a3c07-eb8a-4975-8482-bd9347f124ec" 00:20:22.087 ], 00:20:22.087 "product_name": "Raid Volume", 00:20:22.087 "block_size": 512, 00:20:22.087 "num_blocks": 253952, 00:20:22.087 "uuid": "b24a3c07-eb8a-4975-8482-bd9347f124ec", 00:20:22.087 "assigned_rate_limits": { 00:20:22.087 "rw_ios_per_sec": 0, 00:20:22.087 "rw_mbytes_per_sec": 0, 00:20:22.087 "r_mbytes_per_sec": 0, 00:20:22.087 "w_mbytes_per_sec": 0 00:20:22.087 }, 00:20:22.087 "claimed": false, 00:20:22.087 "zoned": false, 00:20:22.087 "supported_io_types": { 00:20:22.087 "read": true, 00:20:22.087 "write": true, 00:20:22.087 
"unmap": true, 00:20:22.087 "flush": true, 00:20:22.087 "reset": true, 00:20:22.087 "nvme_admin": false, 00:20:22.087 "nvme_io": false, 00:20:22.087 "nvme_io_md": false, 00:20:22.087 "write_zeroes": true, 00:20:22.087 "zcopy": false, 00:20:22.087 "get_zone_info": false, 00:20:22.087 "zone_management": false, 00:20:22.087 "zone_append": false, 00:20:22.087 "compare": false, 00:20:22.087 "compare_and_write": false, 00:20:22.087 "abort": false, 00:20:22.087 "seek_hole": false, 00:20:22.087 "seek_data": false, 00:20:22.087 "copy": false, 00:20:22.087 "nvme_iov_md": false 00:20:22.087 }, 00:20:22.087 "memory_domains": [ 00:20:22.087 { 00:20:22.087 "dma_device_id": "system", 00:20:22.087 "dma_device_type": 1 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.087 "dma_device_type": 2 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "dma_device_id": "system", 00:20:22.087 "dma_device_type": 1 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.087 "dma_device_type": 2 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "dma_device_id": "system", 00:20:22.087 "dma_device_type": 1 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.087 "dma_device_type": 2 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "dma_device_id": "system", 00:20:22.087 "dma_device_type": 1 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.087 "dma_device_type": 2 00:20:22.087 } 00:20:22.087 ], 00:20:22.087 "driver_specific": { 00:20:22.087 "raid": { 00:20:22.087 "uuid": "b24a3c07-eb8a-4975-8482-bd9347f124ec", 00:20:22.087 "strip_size_kb": 64, 00:20:22.087 "state": "online", 00:20:22.087 "raid_level": "concat", 00:20:22.087 "superblock": true, 00:20:22.087 "num_base_bdevs": 4, 00:20:22.087 "num_base_bdevs_discovered": 4, 00:20:22.087 "num_base_bdevs_operational": 4, 00:20:22.087 "base_bdevs_list": [ 00:20:22.087 { 00:20:22.087 "name": "pt1", 
00:20:22.087 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:22.087 "is_configured": true, 00:20:22.087 "data_offset": 2048, 00:20:22.087 "data_size": 63488 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "name": "pt2", 00:20:22.087 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:22.087 "is_configured": true, 00:20:22.087 "data_offset": 2048, 00:20:22.087 "data_size": 63488 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "name": "pt3", 00:20:22.087 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:22.087 "is_configured": true, 00:20:22.087 "data_offset": 2048, 00:20:22.087 "data_size": 63488 00:20:22.087 }, 00:20:22.087 { 00:20:22.087 "name": "pt4", 00:20:22.087 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:22.087 "is_configured": true, 00:20:22.087 "data_offset": 2048, 00:20:22.087 "data_size": 63488 00:20:22.087 } 00:20:22.087 ] 00:20:22.087 } 00:20:22.087 } 00:20:22.087 }' 00:20:22.087 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:22.087 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:22.087 pt2 00:20:22.087 pt3 00:20:22.087 pt4' 00:20:22.087 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.087 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:22.087 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.345 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:22.345 "name": "pt1", 00:20:22.345 "aliases": [ 00:20:22.345 "00000000-0000-0000-0000-000000000001" 00:20:22.345 ], 00:20:22.345 "product_name": "passthru", 00:20:22.345 "block_size": 512, 00:20:22.345 "num_blocks": 65536, 00:20:22.345 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:20:22.345 "assigned_rate_limits": { 00:20:22.345 "rw_ios_per_sec": 0, 00:20:22.345 "rw_mbytes_per_sec": 0, 00:20:22.345 "r_mbytes_per_sec": 0, 00:20:22.345 "w_mbytes_per_sec": 0 00:20:22.345 }, 00:20:22.345 "claimed": true, 00:20:22.345 "claim_type": "exclusive_write", 00:20:22.345 "zoned": false, 00:20:22.345 "supported_io_types": { 00:20:22.345 "read": true, 00:20:22.345 "write": true, 00:20:22.345 "unmap": true, 00:20:22.345 "flush": true, 00:20:22.345 "reset": true, 00:20:22.345 "nvme_admin": false, 00:20:22.345 "nvme_io": false, 00:20:22.345 "nvme_io_md": false, 00:20:22.345 "write_zeroes": true, 00:20:22.345 "zcopy": true, 00:20:22.345 "get_zone_info": false, 00:20:22.345 "zone_management": false, 00:20:22.345 "zone_append": false, 00:20:22.345 "compare": false, 00:20:22.345 "compare_and_write": false, 00:20:22.345 "abort": true, 00:20:22.345 "seek_hole": false, 00:20:22.345 "seek_data": false, 00:20:22.345 "copy": true, 00:20:22.345 "nvme_iov_md": false 00:20:22.345 }, 00:20:22.345 "memory_domains": [ 00:20:22.345 { 00:20:22.345 "dma_device_id": "system", 00:20:22.345 "dma_device_type": 1 00:20:22.345 }, 00:20:22.345 { 00:20:22.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.345 "dma_device_type": 2 00:20:22.345 } 00:20:22.345 ], 00:20:22.345 "driver_specific": { 00:20:22.345 "passthru": { 00:20:22.345 "name": "pt1", 00:20:22.345 "base_bdev_name": "malloc1" 00:20:22.345 } 00:20:22.345 } 00:20:22.345 }' 00:20:22.345 18:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.345 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.345 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:22.346 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.604 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.604 18:24:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:22.604 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.604 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.604 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:22.604 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.604 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.862 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:22.862 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.862 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:22.862 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.862 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:22.862 "name": "pt2", 00:20:22.862 "aliases": [ 00:20:22.862 "00000000-0000-0000-0000-000000000002" 00:20:22.862 ], 00:20:22.862 "product_name": "passthru", 00:20:22.862 "block_size": 512, 00:20:22.862 "num_blocks": 65536, 00:20:22.862 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:22.862 "assigned_rate_limits": { 00:20:22.862 "rw_ios_per_sec": 0, 00:20:22.862 "rw_mbytes_per_sec": 0, 00:20:22.862 "r_mbytes_per_sec": 0, 00:20:22.862 "w_mbytes_per_sec": 0 00:20:22.862 }, 00:20:22.862 "claimed": true, 00:20:22.862 "claim_type": "exclusive_write", 00:20:22.862 "zoned": false, 00:20:22.862 "supported_io_types": { 00:20:22.862 "read": true, 00:20:22.862 "write": true, 00:20:22.862 "unmap": true, 00:20:22.862 "flush": true, 00:20:22.862 "reset": true, 00:20:22.862 "nvme_admin": false, 00:20:22.862 
"nvme_io": false, 00:20:22.862 "nvme_io_md": false, 00:20:22.862 "write_zeroes": true, 00:20:22.862 "zcopy": true, 00:20:22.862 "get_zone_info": false, 00:20:22.862 "zone_management": false, 00:20:22.862 "zone_append": false, 00:20:22.862 "compare": false, 00:20:22.862 "compare_and_write": false, 00:20:22.862 "abort": true, 00:20:22.862 "seek_hole": false, 00:20:22.862 "seek_data": false, 00:20:22.862 "copy": true, 00:20:22.862 "nvme_iov_md": false 00:20:22.862 }, 00:20:22.862 "memory_domains": [ 00:20:22.862 { 00:20:22.862 "dma_device_id": "system", 00:20:22.862 "dma_device_type": 1 00:20:22.862 }, 00:20:22.862 { 00:20:22.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.862 "dma_device_type": 2 00:20:22.862 } 00:20:22.862 ], 00:20:22.862 "driver_specific": { 00:20:22.862 "passthru": { 00:20:22.862 "name": "pt2", 00:20:22.862 "base_bdev_name": "malloc2" 00:20:22.862 } 00:20:22.862 } 00:20:22.862 }' 00:20:22.862 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.862 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.120 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.120 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.120 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.120 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.120 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.120 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.120 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.120 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.120 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:20:23.378 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.378 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.378 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:23.378 18:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.637 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.637 "name": "pt3", 00:20:23.637 "aliases": [ 00:20:23.637 "00000000-0000-0000-0000-000000000003" 00:20:23.637 ], 00:20:23.637 "product_name": "passthru", 00:20:23.637 "block_size": 512, 00:20:23.637 "num_blocks": 65536, 00:20:23.637 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:23.637 "assigned_rate_limits": { 00:20:23.637 "rw_ios_per_sec": 0, 00:20:23.637 "rw_mbytes_per_sec": 0, 00:20:23.637 "r_mbytes_per_sec": 0, 00:20:23.637 "w_mbytes_per_sec": 0 00:20:23.637 }, 00:20:23.637 "claimed": true, 00:20:23.637 "claim_type": "exclusive_write", 00:20:23.637 "zoned": false, 00:20:23.637 "supported_io_types": { 00:20:23.637 "read": true, 00:20:23.637 "write": true, 00:20:23.637 "unmap": true, 00:20:23.637 "flush": true, 00:20:23.637 "reset": true, 00:20:23.637 "nvme_admin": false, 00:20:23.637 "nvme_io": false, 00:20:23.637 "nvme_io_md": false, 00:20:23.637 "write_zeroes": true, 00:20:23.637 "zcopy": true, 00:20:23.637 "get_zone_info": false, 00:20:23.637 "zone_management": false, 00:20:23.637 "zone_append": false, 00:20:23.637 "compare": false, 00:20:23.637 "compare_and_write": false, 00:20:23.637 "abort": true, 00:20:23.637 "seek_hole": false, 00:20:23.637 "seek_data": false, 00:20:23.637 "copy": true, 00:20:23.637 "nvme_iov_md": false 00:20:23.637 }, 00:20:23.637 "memory_domains": [ 00:20:23.637 { 00:20:23.637 "dma_device_id": "system", 00:20:23.637 
"dma_device_type": 1 00:20:23.637 }, 00:20:23.637 { 00:20:23.637 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.637 "dma_device_type": 2 00:20:23.637 } 00:20:23.637 ], 00:20:23.637 "driver_specific": { 00:20:23.637 "passthru": { 00:20:23.637 "name": "pt3", 00:20:23.637 "base_bdev_name": "malloc3" 00:20:23.637 } 00:20:23.637 } 00:20:23.637 }' 00:20:23.637 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.637 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.637 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.637 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.637 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.637 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.637 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.637 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.895 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.895 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.895 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.895 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.895 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.896 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:23.896 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:24.154 18:24:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:24.154 "name": "pt4", 00:20:24.154 "aliases": [ 00:20:24.154 "00000000-0000-0000-0000-000000000004" 00:20:24.154 ], 00:20:24.154 "product_name": "passthru", 00:20:24.154 "block_size": 512, 00:20:24.154 "num_blocks": 65536, 00:20:24.154 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:24.154 "assigned_rate_limits": { 00:20:24.154 "rw_ios_per_sec": 0, 00:20:24.154 "rw_mbytes_per_sec": 0, 00:20:24.154 "r_mbytes_per_sec": 0, 00:20:24.154 "w_mbytes_per_sec": 0 00:20:24.154 }, 00:20:24.154 "claimed": true, 00:20:24.154 "claim_type": "exclusive_write", 00:20:24.154 "zoned": false, 00:20:24.154 "supported_io_types": { 00:20:24.154 "read": true, 00:20:24.154 "write": true, 00:20:24.154 "unmap": true, 00:20:24.154 "flush": true, 00:20:24.154 "reset": true, 00:20:24.154 "nvme_admin": false, 00:20:24.154 "nvme_io": false, 00:20:24.154 "nvme_io_md": false, 00:20:24.154 "write_zeroes": true, 00:20:24.154 "zcopy": true, 00:20:24.154 "get_zone_info": false, 00:20:24.154 "zone_management": false, 00:20:24.154 "zone_append": false, 00:20:24.154 "compare": false, 00:20:24.154 "compare_and_write": false, 00:20:24.154 "abort": true, 00:20:24.154 "seek_hole": false, 00:20:24.154 "seek_data": false, 00:20:24.154 "copy": true, 00:20:24.154 "nvme_iov_md": false 00:20:24.154 }, 00:20:24.154 "memory_domains": [ 00:20:24.154 { 00:20:24.154 "dma_device_id": "system", 00:20:24.154 "dma_device_type": 1 00:20:24.154 }, 00:20:24.154 { 00:20:24.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.154 "dma_device_type": 2 00:20:24.154 } 00:20:24.154 ], 00:20:24.154 "driver_specific": { 00:20:24.154 "passthru": { 00:20:24.154 "name": "pt4", 00:20:24.154 "base_bdev_name": "malloc4" 00:20:24.154 } 00:20:24.155 } 00:20:24.155 }' 00:20:24.155 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.155 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.155 18:24:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:24.155 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.155 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.413 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:24.413 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.413 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.413 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:24.413 18:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.413 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.413 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:24.413 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:24.413 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:24.672 [2024-07-12 18:24:08.281208] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:24.672 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b24a3c07-eb8a-4975-8482-bd9347f124ec 00:20:24.672 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b24a3c07-eb8a-4975-8482-bd9347f124ec ']' 00:20:24.672 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:24.931 [2024-07-12 18:24:08.513517] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:24.931 
[2024-07-12 18:24:08.513537] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:24.931 [2024-07-12 18:24:08.513586] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:24.931 [2024-07-12 18:24:08.513648] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:24.931 [2024-07-12 18:24:08.513665] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd92530 name raid_bdev1, state offline 00:20:24.931 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.931 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:25.189 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:25.189 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:25.189 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:25.189 18:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:25.448 18:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:25.448 18:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:25.706 18:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:25.706 18:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:25.964 18:24:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:25.964 18:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:26.223 18:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:26.223 18:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:26.482 18:24:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:26.740 [2024-07-12 18:24:10.221992] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:26.740 [2024-07-12 18:24:10.223357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:26.740 [2024-07-12 18:24:10.223401] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:20:26.740 [2024-07-12 18:24:10.223435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:26.740 [2024-07-12 18:24:10.223481] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:26.740 [2024-07-12 18:24:10.223521] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:26.740 [2024-07-12 18:24:10.223544] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:26.740 [2024-07-12 18:24:10.223566] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:26.740 
[2024-07-12 18:24:10.223585] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:26.740 [2024-07-12 18:24:10.223595] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf3dff0 name raid_bdev1, state configuring 00:20:26.740 request: 00:20:26.740 { 00:20:26.740 "name": "raid_bdev1", 00:20:26.740 "raid_level": "concat", 00:20:26.740 "base_bdevs": [ 00:20:26.740 "malloc1", 00:20:26.740 "malloc2", 00:20:26.740 "malloc3", 00:20:26.740 "malloc4" 00:20:26.740 ], 00:20:26.740 "strip_size_kb": 64, 00:20:26.740 "superblock": false, 00:20:26.740 "method": "bdev_raid_create", 00:20:26.740 "req_id": 1 00:20:26.740 } 00:20:26.740 Got JSON-RPC error response 00:20:26.740 response: 00:20:26.740 { 00:20:26.740 "code": -17, 00:20:26.740 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:26.740 } 00:20:26.740 18:24:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:26.740 18:24:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:26.740 18:24:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:26.740 18:24:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:26.740 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.741 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:26.999 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:26.999 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:26.999 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:20:26.999 [2024-07-12 18:24:10.715217] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:26.999 [2024-07-12 18:24:10.715265] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:26.999 [2024-07-12 18:24:10.715286] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd9a7a0 00:20:26.999 [2024-07-12 18:24:10.715298] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:26.999 [2024-07-12 18:24:10.716975] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:26.999 [2024-07-12 18:24:10.717005] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:26.999 [2024-07-12 18:24:10.717087] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:26.999 [2024-07-12 18:24:10.717116] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:26.999 pt1 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:27.258 "name": "raid_bdev1", 00:20:27.258 "uuid": "b24a3c07-eb8a-4975-8482-bd9347f124ec", 00:20:27.258 "strip_size_kb": 64, 00:20:27.258 "state": "configuring", 00:20:27.258 "raid_level": "concat", 00:20:27.258 "superblock": true, 00:20:27.258 "num_base_bdevs": 4, 00:20:27.258 "num_base_bdevs_discovered": 1, 00:20:27.258 "num_base_bdevs_operational": 4, 00:20:27.258 "base_bdevs_list": [ 00:20:27.258 { 00:20:27.258 "name": "pt1", 00:20:27.258 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:27.258 "is_configured": true, 00:20:27.258 "data_offset": 2048, 00:20:27.258 "data_size": 63488 00:20:27.258 }, 00:20:27.258 { 00:20:27.258 "name": null, 00:20:27.258 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:27.258 "is_configured": false, 00:20:27.258 "data_offset": 2048, 00:20:27.258 "data_size": 63488 00:20:27.258 }, 00:20:27.258 { 00:20:27.258 "name": null, 00:20:27.258 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:27.258 "is_configured": false, 00:20:27.258 "data_offset": 2048, 00:20:27.258 "data_size": 63488 00:20:27.258 }, 00:20:27.258 { 00:20:27.258 "name": null, 00:20:27.258 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:27.258 "is_configured": false, 00:20:27.258 "data_offset": 2048, 00:20:27.258 "data_size": 63488 00:20:27.258 } 00:20:27.258 ] 00:20:27.258 }' 00:20:27.258 18:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:27.258 18:24:10 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:28.193 18:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:28.193 18:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:28.193 [2024-07-12 18:24:11.802112] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:28.193 [2024-07-12 18:24:11.802161] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:28.193 [2024-07-12 18:24:11.802179] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd91ea0 00:20:28.193 [2024-07-12 18:24:11.802191] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:28.193 [2024-07-12 18:24:11.802540] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:28.193 [2024-07-12 18:24:11.802557] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:28.193 [2024-07-12 18:24:11.802620] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:28.193 [2024-07-12 18:24:11.802638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:28.193 pt2 00:20:28.193 18:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:28.464 [2024-07-12 18:24:12.042761] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:28.464 18:24:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.464 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.726 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.726 "name": "raid_bdev1", 00:20:28.726 "uuid": "b24a3c07-eb8a-4975-8482-bd9347f124ec", 00:20:28.726 "strip_size_kb": 64, 00:20:28.726 "state": "configuring", 00:20:28.726 "raid_level": "concat", 00:20:28.726 "superblock": true, 00:20:28.726 "num_base_bdevs": 4, 00:20:28.726 "num_base_bdevs_discovered": 1, 00:20:28.726 "num_base_bdevs_operational": 4, 00:20:28.726 "base_bdevs_list": [ 00:20:28.726 { 00:20:28.726 "name": "pt1", 00:20:28.726 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:28.726 "is_configured": true, 00:20:28.726 "data_offset": 2048, 00:20:28.726 "data_size": 63488 00:20:28.726 }, 00:20:28.726 { 00:20:28.726 "name": null, 00:20:28.726 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:28.726 
"is_configured": false, 00:20:28.726 "data_offset": 2048, 00:20:28.726 "data_size": 63488 00:20:28.726 }, 00:20:28.726 { 00:20:28.726 "name": null, 00:20:28.726 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:28.726 "is_configured": false, 00:20:28.726 "data_offset": 2048, 00:20:28.726 "data_size": 63488 00:20:28.726 }, 00:20:28.726 { 00:20:28.726 "name": null, 00:20:28.726 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:28.726 "is_configured": false, 00:20:28.726 "data_offset": 2048, 00:20:28.726 "data_size": 63488 00:20:28.726 } 00:20:28.726 ] 00:20:28.726 }' 00:20:28.726 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.726 18:24:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.290 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:29.290 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:29.290 18:24:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:29.547 [2024-07-12 18:24:13.113604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:29.547 [2024-07-12 18:24:13.113651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:29.547 [2024-07-12 18:24:13.113669] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd90ec0 00:20:29.547 [2024-07-12 18:24:13.113682] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:29.547 [2024-07-12 18:24:13.114031] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:29.547 [2024-07-12 18:24:13.114049] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:29.547 [2024-07-12 18:24:13.114108] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:29.547 [2024-07-12 18:24:13.114126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:29.547 pt2 00:20:29.547 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:29.547 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:29.547 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:29.805 [2024-07-12 18:24:13.358260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:29.805 [2024-07-12 18:24:13.358295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:29.805 [2024-07-12 18:24:13.358311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd910f0 00:20:29.805 [2024-07-12 18:24:13.358323] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:29.805 [2024-07-12 18:24:13.358631] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:29.805 [2024-07-12 18:24:13.358654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:29.805 [2024-07-12 18:24:13.358713] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:29.805 [2024-07-12 18:24:13.358731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:29.805 pt3 00:20:29.805 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:29.805 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:29.805 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:30.063 [2024-07-12 18:24:13.610934] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:30.063 [2024-07-12 18:24:13.610970] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:30.063 [2024-07-12 18:24:13.610986] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd99af0 00:20:30.063 [2024-07-12 18:24:13.610997] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:30.063 [2024-07-12 18:24:13.611302] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:30.063 [2024-07-12 18:24:13.611318] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:30.063 [2024-07-12 18:24:13.611370] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:30.063 [2024-07-12 18:24:13.611388] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:30.063 [2024-07-12 18:24:13.611508] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd938f0 00:20:30.063 [2024-07-12 18:24:13.611519] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:30.063 [2024-07-12 18:24:13.611688] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd93150 00:20:30.063 [2024-07-12 18:24:13.611817] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd938f0 00:20:30.063 [2024-07-12 18:24:13.611827] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd938f0 00:20:30.063 [2024-07-12 18:24:13.611923] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:30.063 pt4 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.063 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.322 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:30.322 "name": "raid_bdev1", 00:20:30.322 "uuid": "b24a3c07-eb8a-4975-8482-bd9347f124ec", 00:20:30.322 "strip_size_kb": 64, 00:20:30.322 "state": "online", 00:20:30.322 "raid_level": "concat", 00:20:30.322 "superblock": true, 00:20:30.322 "num_base_bdevs": 4, 00:20:30.322 "num_base_bdevs_discovered": 4, 00:20:30.322 "num_base_bdevs_operational": 4, 00:20:30.322 "base_bdevs_list": [ 00:20:30.322 { 
00:20:30.322 "name": "pt1", 00:20:30.322 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:30.322 "is_configured": true, 00:20:30.322 "data_offset": 2048, 00:20:30.322 "data_size": 63488 00:20:30.322 }, 00:20:30.322 { 00:20:30.322 "name": "pt2", 00:20:30.322 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:30.322 "is_configured": true, 00:20:30.322 "data_offset": 2048, 00:20:30.322 "data_size": 63488 00:20:30.322 }, 00:20:30.322 { 00:20:30.322 "name": "pt3", 00:20:30.322 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:30.322 "is_configured": true, 00:20:30.322 "data_offset": 2048, 00:20:30.322 "data_size": 63488 00:20:30.322 }, 00:20:30.322 { 00:20:30.322 "name": "pt4", 00:20:30.322 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:30.322 "is_configured": true, 00:20:30.322 "data_offset": 2048, 00:20:30.322 "data_size": 63488 00:20:30.322 } 00:20:30.322 ] 00:20:30.322 }' 00:20:30.322 18:24:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:30.322 18:24:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.888 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:30.888 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:30.888 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:30.888 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:30.888 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:30.889 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:30.889 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:30.889 18:24:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:31.147 [2024-07-12 18:24:14.714186] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:31.147 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:31.147 "name": "raid_bdev1", 00:20:31.147 "aliases": [ 00:20:31.147 "b24a3c07-eb8a-4975-8482-bd9347f124ec" 00:20:31.147 ], 00:20:31.147 "product_name": "Raid Volume", 00:20:31.147 "block_size": 512, 00:20:31.147 "num_blocks": 253952, 00:20:31.147 "uuid": "b24a3c07-eb8a-4975-8482-bd9347f124ec", 00:20:31.147 "assigned_rate_limits": { 00:20:31.147 "rw_ios_per_sec": 0, 00:20:31.147 "rw_mbytes_per_sec": 0, 00:20:31.147 "r_mbytes_per_sec": 0, 00:20:31.147 "w_mbytes_per_sec": 0 00:20:31.147 }, 00:20:31.147 "claimed": false, 00:20:31.147 "zoned": false, 00:20:31.147 "supported_io_types": { 00:20:31.147 "read": true, 00:20:31.147 "write": true, 00:20:31.147 "unmap": true, 00:20:31.147 "flush": true, 00:20:31.147 "reset": true, 00:20:31.147 "nvme_admin": false, 00:20:31.147 "nvme_io": false, 00:20:31.147 "nvme_io_md": false, 00:20:31.147 "write_zeroes": true, 00:20:31.147 "zcopy": false, 00:20:31.147 "get_zone_info": false, 00:20:31.147 "zone_management": false, 00:20:31.147 "zone_append": false, 00:20:31.147 "compare": false, 00:20:31.147 "compare_and_write": false, 00:20:31.147 "abort": false, 00:20:31.147 "seek_hole": false, 00:20:31.147 "seek_data": false, 00:20:31.147 "copy": false, 00:20:31.147 "nvme_iov_md": false 00:20:31.147 }, 00:20:31.147 "memory_domains": [ 00:20:31.147 { 00:20:31.147 "dma_device_id": "system", 00:20:31.147 "dma_device_type": 1 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.147 "dma_device_type": 2 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "dma_device_id": "system", 00:20:31.147 "dma_device_type": 1 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.147 "dma_device_type": 2 00:20:31.147 }, 
00:20:31.147 { 00:20:31.147 "dma_device_id": "system", 00:20:31.147 "dma_device_type": 1 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.147 "dma_device_type": 2 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "dma_device_id": "system", 00:20:31.147 "dma_device_type": 1 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.147 "dma_device_type": 2 00:20:31.147 } 00:20:31.147 ], 00:20:31.147 "driver_specific": { 00:20:31.147 "raid": { 00:20:31.147 "uuid": "b24a3c07-eb8a-4975-8482-bd9347f124ec", 00:20:31.147 "strip_size_kb": 64, 00:20:31.147 "state": "online", 00:20:31.147 "raid_level": "concat", 00:20:31.147 "superblock": true, 00:20:31.147 "num_base_bdevs": 4, 00:20:31.147 "num_base_bdevs_discovered": 4, 00:20:31.147 "num_base_bdevs_operational": 4, 00:20:31.147 "base_bdevs_list": [ 00:20:31.147 { 00:20:31.147 "name": "pt1", 00:20:31.147 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:31.147 "is_configured": true, 00:20:31.147 "data_offset": 2048, 00:20:31.147 "data_size": 63488 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "name": "pt2", 00:20:31.147 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:31.147 "is_configured": true, 00:20:31.147 "data_offset": 2048, 00:20:31.147 "data_size": 63488 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "name": "pt3", 00:20:31.147 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:31.147 "is_configured": true, 00:20:31.147 "data_offset": 2048, 00:20:31.147 "data_size": 63488 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "name": "pt4", 00:20:31.147 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:31.147 "is_configured": true, 00:20:31.147 "data_offset": 2048, 00:20:31.147 "data_size": 63488 00:20:31.147 } 00:20:31.147 ] 00:20:31.147 } 00:20:31.147 } 00:20:31.147 }' 00:20:31.147 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:20:31.147 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:31.147 pt2 00:20:31.147 pt3 00:20:31.147 pt4' 00:20:31.147 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.147 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:31.147 18:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.405 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:31.405 "name": "pt1", 00:20:31.405 "aliases": [ 00:20:31.405 "00000000-0000-0000-0000-000000000001" 00:20:31.405 ], 00:20:31.405 "product_name": "passthru", 00:20:31.405 "block_size": 512, 00:20:31.405 "num_blocks": 65536, 00:20:31.405 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:31.405 "assigned_rate_limits": { 00:20:31.405 "rw_ios_per_sec": 0, 00:20:31.405 "rw_mbytes_per_sec": 0, 00:20:31.405 "r_mbytes_per_sec": 0, 00:20:31.405 "w_mbytes_per_sec": 0 00:20:31.405 }, 00:20:31.405 "claimed": true, 00:20:31.405 "claim_type": "exclusive_write", 00:20:31.405 "zoned": false, 00:20:31.405 "supported_io_types": { 00:20:31.405 "read": true, 00:20:31.405 "write": true, 00:20:31.405 "unmap": true, 00:20:31.405 "flush": true, 00:20:31.405 "reset": true, 00:20:31.405 "nvme_admin": false, 00:20:31.405 "nvme_io": false, 00:20:31.405 "nvme_io_md": false, 00:20:31.405 "write_zeroes": true, 00:20:31.405 "zcopy": true, 00:20:31.405 "get_zone_info": false, 00:20:31.405 "zone_management": false, 00:20:31.405 "zone_append": false, 00:20:31.405 "compare": false, 00:20:31.405 "compare_and_write": false, 00:20:31.405 "abort": true, 00:20:31.405 "seek_hole": false, 00:20:31.405 "seek_data": false, 00:20:31.405 "copy": true, 00:20:31.405 "nvme_iov_md": false 00:20:31.405 }, 00:20:31.405 "memory_domains": [ 00:20:31.405 { 
00:20:31.405 "dma_device_id": "system", 00:20:31.405 "dma_device_type": 1 00:20:31.405 }, 00:20:31.405 { 00:20:31.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.405 "dma_device_type": 2 00:20:31.405 } 00:20:31.405 ], 00:20:31.405 "driver_specific": { 00:20:31.405 "passthru": { 00:20:31.405 "name": "pt1", 00:20:31.405 "base_bdev_name": "malloc1" 00:20:31.405 } 00:20:31.405 } 00:20:31.405 }' 00:20:31.405 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.405 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.405 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:31.405 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.663 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.663 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.663 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.663 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.663 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:31.663 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.663 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.922 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.922 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.922 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:31.922 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.922 
18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:31.922 "name": "pt2", 00:20:31.922 "aliases": [ 00:20:31.922 "00000000-0000-0000-0000-000000000002" 00:20:31.922 ], 00:20:31.922 "product_name": "passthru", 00:20:31.922 "block_size": 512, 00:20:31.922 "num_blocks": 65536, 00:20:31.922 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:31.922 "assigned_rate_limits": { 00:20:31.922 "rw_ios_per_sec": 0, 00:20:31.922 "rw_mbytes_per_sec": 0, 00:20:31.922 "r_mbytes_per_sec": 0, 00:20:31.922 "w_mbytes_per_sec": 0 00:20:31.922 }, 00:20:31.922 "claimed": true, 00:20:31.922 "claim_type": "exclusive_write", 00:20:31.922 "zoned": false, 00:20:31.922 "supported_io_types": { 00:20:31.922 "read": true, 00:20:31.922 "write": true, 00:20:31.922 "unmap": true, 00:20:31.922 "flush": true, 00:20:31.922 "reset": true, 00:20:31.922 "nvme_admin": false, 00:20:31.922 "nvme_io": false, 00:20:31.922 "nvme_io_md": false, 00:20:31.922 "write_zeroes": true, 00:20:31.922 "zcopy": true, 00:20:31.922 "get_zone_info": false, 00:20:31.922 "zone_management": false, 00:20:31.922 "zone_append": false, 00:20:31.922 "compare": false, 00:20:31.922 "compare_and_write": false, 00:20:31.922 "abort": true, 00:20:31.922 "seek_hole": false, 00:20:31.922 "seek_data": false, 00:20:31.922 "copy": true, 00:20:31.922 "nvme_iov_md": false 00:20:31.922 }, 00:20:31.922 "memory_domains": [ 00:20:31.922 { 00:20:31.922 "dma_device_id": "system", 00:20:31.922 "dma_device_type": 1 00:20:31.922 }, 00:20:31.922 { 00:20:31.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.922 "dma_device_type": 2 00:20:31.922 } 00:20:31.922 ], 00:20:31.922 "driver_specific": { 00:20:31.922 "passthru": { 00:20:31.922 "name": "pt2", 00:20:31.922 "base_bdev_name": "malloc2" 00:20:31.922 } 00:20:31.922 } 00:20:31.922 }' 00:20:31.922 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.181 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:20:32.181 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.181 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.181 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.181 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.181 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.181 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.439 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.439 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.439 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.439 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:32.439 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:32.439 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:32.439 18:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:33.006 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:33.006 "name": "pt3", 00:20:33.006 "aliases": [ 00:20:33.006 "00000000-0000-0000-0000-000000000003" 00:20:33.006 ], 00:20:33.006 "product_name": "passthru", 00:20:33.006 "block_size": 512, 00:20:33.006 "num_blocks": 65536, 00:20:33.006 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:33.006 "assigned_rate_limits": { 00:20:33.006 "rw_ios_per_sec": 0, 00:20:33.006 "rw_mbytes_per_sec": 0, 00:20:33.006 "r_mbytes_per_sec": 0, 00:20:33.006 "w_mbytes_per_sec": 0 00:20:33.006 }, 
00:20:33.006 "claimed": true, 00:20:33.006 "claim_type": "exclusive_write", 00:20:33.006 "zoned": false, 00:20:33.006 "supported_io_types": { 00:20:33.006 "read": true, 00:20:33.006 "write": true, 00:20:33.006 "unmap": true, 00:20:33.006 "flush": true, 00:20:33.006 "reset": true, 00:20:33.006 "nvme_admin": false, 00:20:33.006 "nvme_io": false, 00:20:33.006 "nvme_io_md": false, 00:20:33.006 "write_zeroes": true, 00:20:33.006 "zcopy": true, 00:20:33.006 "get_zone_info": false, 00:20:33.006 "zone_management": false, 00:20:33.006 "zone_append": false, 00:20:33.006 "compare": false, 00:20:33.006 "compare_and_write": false, 00:20:33.006 "abort": true, 00:20:33.006 "seek_hole": false, 00:20:33.006 "seek_data": false, 00:20:33.006 "copy": true, 00:20:33.006 "nvme_iov_md": false 00:20:33.006 }, 00:20:33.006 "memory_domains": [ 00:20:33.006 { 00:20:33.006 "dma_device_id": "system", 00:20:33.006 "dma_device_type": 1 00:20:33.006 }, 00:20:33.006 { 00:20:33.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:33.006 "dma_device_type": 2 00:20:33.006 } 00:20:33.006 ], 00:20:33.006 "driver_specific": { 00:20:33.006 "passthru": { 00:20:33.006 "name": "pt3", 00:20:33.006 "base_bdev_name": "malloc3" 00:20:33.006 } 00:20:33.006 } 00:20:33.006 }' 00:20:33.006 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:33.006 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:33.006 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:33.006 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:33.006 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:33.006 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:33.006 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:33.264 18:24:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:33.264 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:33.264 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.264 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.264 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:33.264 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:33.264 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:33.264 18:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:33.523 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:33.523 "name": "pt4", 00:20:33.523 "aliases": [ 00:20:33.523 "00000000-0000-0000-0000-000000000004" 00:20:33.523 ], 00:20:33.523 "product_name": "passthru", 00:20:33.523 "block_size": 512, 00:20:33.523 "num_blocks": 65536, 00:20:33.523 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:33.523 "assigned_rate_limits": { 00:20:33.523 "rw_ios_per_sec": 0, 00:20:33.523 "rw_mbytes_per_sec": 0, 00:20:33.523 "r_mbytes_per_sec": 0, 00:20:33.523 "w_mbytes_per_sec": 0 00:20:33.523 }, 00:20:33.523 "claimed": true, 00:20:33.523 "claim_type": "exclusive_write", 00:20:33.523 "zoned": false, 00:20:33.523 "supported_io_types": { 00:20:33.523 "read": true, 00:20:33.523 "write": true, 00:20:33.523 "unmap": true, 00:20:33.523 "flush": true, 00:20:33.523 "reset": true, 00:20:33.523 "nvme_admin": false, 00:20:33.523 "nvme_io": false, 00:20:33.523 "nvme_io_md": false, 00:20:33.523 "write_zeroes": true, 00:20:33.523 "zcopy": true, 00:20:33.523 "get_zone_info": false, 00:20:33.523 "zone_management": false, 00:20:33.523 "zone_append": false, 00:20:33.523 
"compare": false, 00:20:33.523 "compare_and_write": false, 00:20:33.523 "abort": true, 00:20:33.523 "seek_hole": false, 00:20:33.523 "seek_data": false, 00:20:33.523 "copy": true, 00:20:33.523 "nvme_iov_md": false 00:20:33.523 }, 00:20:33.523 "memory_domains": [ 00:20:33.523 { 00:20:33.523 "dma_device_id": "system", 00:20:33.523 "dma_device_type": 1 00:20:33.523 }, 00:20:33.523 { 00:20:33.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:33.523 "dma_device_type": 2 00:20:33.523 } 00:20:33.523 ], 00:20:33.523 "driver_specific": { 00:20:33.523 "passthru": { 00:20:33.523 "name": "pt4", 00:20:33.523 "base_bdev_name": "malloc4" 00:20:33.523 } 00:20:33.523 } 00:20:33.523 }' 00:20:33.523 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:33.523 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:33.523 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:33.523 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:33.523 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:33.782 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:33.782 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:33.782 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:33.782 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:33.782 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.782 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.782 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:33.782 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:33.782 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:34.348 [2024-07-12 18:24:17.882611] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b24a3c07-eb8a-4975-8482-bd9347f124ec '!=' b24a3c07-eb8a-4975-8482-bd9347f124ec ']' 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2543641 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2543641 ']' 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2543641 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2543641 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2543641' 00:20:34.348 killing process with pid 2543641 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2543641 00:20:34.348 [2024-07-12 
18:24:17.966658] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:34.348 [2024-07-12 18:24:17.966719] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:34.348 [2024-07-12 18:24:17.966778] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:34.348 [2024-07-12 18:24:17.966791] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd938f0 name raid_bdev1, state offline 00:20:34.348 18:24:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2543641 00:20:34.348 [2024-07-12 18:24:18.003306] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:34.608 18:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:34.608 00:20:34.608 real 0m16.693s 00:20:34.608 user 0m30.100s 00:20:34.608 sys 0m3.024s 00:20:34.608 18:24:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:34.608 18:24:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.608 ************************************ 00:20:34.608 END TEST raid_superblock_test 00:20:34.608 ************************************ 00:20:34.608 18:24:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:34.608 18:24:18 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:20:34.608 18:24:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:34.608 18:24:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:34.608 18:24:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:34.608 ************************************ 00:20:34.608 START TEST raid_read_error_test 00:20:34.608 ************************************ 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:20:34.608 18:24:18 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.SelPG277Yu 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2546489 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2546489 /var/tmp/spdk-raid.sock 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2546489 ']' 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:20:34.608 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:34.608 18:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.866 [2024-07-12 18:24:18.376769] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:20:34.866 [2024-07-12 18:24:18.376837] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2546489 ] 00:20:34.866 [2024-07-12 18:24:18.506303] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:35.124 [2024-07-12 18:24:18.613214] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:35.124 [2024-07-12 18:24:18.684233] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:35.124 [2024-07-12 18:24:18.684271] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:35.691 18:24:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:35.691 18:24:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:35.692 18:24:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:35.692 18:24:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:35.950 BaseBdev1_malloc 00:20:35.950 18:24:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:36.208 true 00:20:36.208 
18:24:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:36.466 [2024-07-12 18:24:20.022923] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:36.466 [2024-07-12 18:24:20.022976] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:36.466 [2024-07-12 18:24:20.023002] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27150d0 00:20:36.466 [2024-07-12 18:24:20.023015] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:36.466 [2024-07-12 18:24:20.024901] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:36.467 [2024-07-12 18:24:20.024940] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:36.467 BaseBdev1 00:20:36.467 18:24:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:36.467 18:24:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:36.725 BaseBdev2_malloc 00:20:36.725 18:24:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:36.985 true 00:20:36.985 18:24:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:36.985 [2024-07-12 18:24:20.625226] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:36.985 [2024-07-12 18:24:20.625268] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:36.985 [2024-07-12 18:24:20.625291] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2719910 00:20:36.985 [2024-07-12 18:24:20.625304] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:36.985 [2024-07-12 18:24:20.626747] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:36.985 [2024-07-12 18:24:20.626773] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:36.985 BaseBdev2 00:20:36.985 18:24:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:36.985 18:24:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:37.244 BaseBdev3_malloc 00:20:37.244 18:24:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:37.501 true 00:20:37.501 18:24:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:37.760 [2024-07-12 18:24:21.393054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:37.760 [2024-07-12 18:24:21.393099] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:37.760 [2024-07-12 18:24:21.393123] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x271bbd0 00:20:37.760 [2024-07-12 18:24:21.393136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:37.760 [2024-07-12 18:24:21.394682] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:20:37.760 [2024-07-12 18:24:21.394710] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:37.760 BaseBdev3 00:20:37.760 18:24:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:37.760 18:24:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:38.018 BaseBdev4_malloc 00:20:38.018 18:24:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:38.276 true 00:20:38.277 18:24:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:38.534 [2024-07-12 18:24:22.127556] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:38.534 [2024-07-12 18:24:22.127599] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:38.534 [2024-07-12 18:24:22.127625] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x271caa0 00:20:38.534 [2024-07-12 18:24:22.127638] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:38.534 [2024-07-12 18:24:22.129236] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:38.534 [2024-07-12 18:24:22.129264] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:38.534 BaseBdev4 00:20:38.534 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:20:38.872 [2024-07-12 18:24:22.368227] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:38.872 [2024-07-12 18:24:22.369590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:38.872 [2024-07-12 18:24:22.369659] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:38.872 [2024-07-12 18:24:22.369719] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:38.872 [2024-07-12 18:24:22.369952] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2716c20 00:20:38.872 [2024-07-12 18:24:22.369964] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:38.872 [2024-07-12 18:24:22.370164] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x256b260 00:20:38.872 [2024-07-12 18:24:22.370311] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2716c20 00:20:38.872 [2024-07-12 18:24:22.370321] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2716c20 00:20:38.872 [2024-07-12 18:24:22.370424] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:38.872 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.130 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.130 "name": "raid_bdev1", 00:20:39.130 "uuid": "5439d56b-3605-4c4b-9416-9dfb6ebeb625", 00:20:39.130 "strip_size_kb": 64, 00:20:39.130 "state": "online", 00:20:39.130 "raid_level": "concat", 00:20:39.130 "superblock": true, 00:20:39.130 "num_base_bdevs": 4, 00:20:39.130 "num_base_bdevs_discovered": 4, 00:20:39.130 "num_base_bdevs_operational": 4, 00:20:39.130 "base_bdevs_list": [ 00:20:39.130 { 00:20:39.130 "name": "BaseBdev1", 00:20:39.130 "uuid": "0615ae99-ca6f-57c9-94e4-ba31864c227d", 00:20:39.130 "is_configured": true, 00:20:39.130 "data_offset": 2048, 00:20:39.130 "data_size": 63488 00:20:39.130 }, 00:20:39.130 { 00:20:39.130 "name": "BaseBdev2", 00:20:39.130 "uuid": "72354a92-ddf4-57c9-b7dd-58a6d7caf865", 00:20:39.130 "is_configured": true, 00:20:39.130 "data_offset": 2048, 00:20:39.130 "data_size": 63488 00:20:39.130 }, 00:20:39.130 { 00:20:39.130 "name": "BaseBdev3", 00:20:39.130 "uuid": "22715498-2eb2-5e07-8c7e-2558cc1a6529", 00:20:39.130 "is_configured": true, 00:20:39.130 "data_offset": 2048, 00:20:39.130 "data_size": 63488 00:20:39.130 }, 00:20:39.130 { 00:20:39.130 "name": "BaseBdev4", 00:20:39.130 "uuid": "7836445f-136e-5582-8c1e-29158c7eab89", 00:20:39.130 
"is_configured": true, 00:20:39.130 "data_offset": 2048, 00:20:39.130 "data_size": 63488 00:20:39.130 } 00:20:39.130 ] 00:20:39.130 }' 00:20:39.130 18:24:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.130 18:24:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:39.695 18:24:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:39.695 18:24:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:39.695 [2024-07-12 18:24:23.383174] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2708fc0 00:20:40.630 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.888 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:41.146 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:41.146 "name": "raid_bdev1", 00:20:41.146 "uuid": "5439d56b-3605-4c4b-9416-9dfb6ebeb625", 00:20:41.146 "strip_size_kb": 64, 00:20:41.146 "state": "online", 00:20:41.146 "raid_level": "concat", 00:20:41.146 "superblock": true, 00:20:41.146 "num_base_bdevs": 4, 00:20:41.146 "num_base_bdevs_discovered": 4, 00:20:41.146 "num_base_bdevs_operational": 4, 00:20:41.146 "base_bdevs_list": [ 00:20:41.146 { 00:20:41.146 "name": "BaseBdev1", 00:20:41.146 "uuid": "0615ae99-ca6f-57c9-94e4-ba31864c227d", 00:20:41.146 "is_configured": true, 00:20:41.146 "data_offset": 2048, 00:20:41.146 "data_size": 63488 00:20:41.146 }, 00:20:41.146 { 00:20:41.146 "name": "BaseBdev2", 00:20:41.146 "uuid": "72354a92-ddf4-57c9-b7dd-58a6d7caf865", 00:20:41.146 "is_configured": true, 00:20:41.146 "data_offset": 2048, 00:20:41.146 "data_size": 63488 00:20:41.146 }, 00:20:41.146 { 00:20:41.146 "name": "BaseBdev3", 00:20:41.146 "uuid": "22715498-2eb2-5e07-8c7e-2558cc1a6529", 00:20:41.146 "is_configured": true, 00:20:41.146 "data_offset": 2048, 00:20:41.146 "data_size": 63488 00:20:41.146 }, 00:20:41.146 { 00:20:41.146 "name": "BaseBdev4", 00:20:41.146 "uuid": 
"7836445f-136e-5582-8c1e-29158c7eab89", 00:20:41.146 "is_configured": true, 00:20:41.146 "data_offset": 2048, 00:20:41.146 "data_size": 63488 00:20:41.146 } 00:20:41.146 ] 00:20:41.146 }' 00:20:41.146 18:24:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:41.146 18:24:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:41.713 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:41.971 [2024-07-12 18:24:25.576960] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:41.971 [2024-07-12 18:24:25.576996] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:41.971 [2024-07-12 18:24:25.580154] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:41.971 [2024-07-12 18:24:25.580191] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:41.971 [2024-07-12 18:24:25.580232] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:41.971 [2024-07-12 18:24:25.580243] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2716c20 name raid_bdev1, state offline 00:20:41.971 0 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2546489 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2546489 ']' 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2546489 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
ps --no-headers -o comm= 2546489 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2546489' 00:20:41.971 killing process with pid 2546489 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2546489 00:20:41.971 [2024-07-12 18:24:25.634956] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:41.971 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2546489 00:20:41.971 [2024-07-12 18:24:25.665276] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.SelPG277Yu 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:20:42.230 00:20:42.230 real 0m7.590s 00:20:42.230 user 0m12.203s 00:20:42.230 sys 0m1.284s 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:42.230 18:24:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.230 
************************************ 00:20:42.230 END TEST raid_read_error_test 00:20:42.230 ************************************ 00:20:42.230 18:24:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:42.230 18:24:25 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:20:42.230 18:24:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:42.230 18:24:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:42.230 18:24:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:42.489 ************************************ 00:20:42.489 START TEST raid_write_error_test 00:20:42.489 ************************************ 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Gw2CZZFAjo 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # 
raid_pid=2547638 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2547638 /var/tmp/spdk-raid.sock 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2547638 ']' 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:42.489 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:42.489 18:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.489 [2024-07-12 18:24:26.053206] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:20:42.489 [2024-07-12 18:24:26.053274] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2547638 ] 00:20:42.489 [2024-07-12 18:24:26.181966] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:42.747 [2024-07-12 18:24:26.289049] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:42.747 [2024-07-12 18:24:26.356021] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:42.747 [2024-07-12 18:24:26.356060] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:43.313 18:24:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:43.313 18:24:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:43.313 18:24:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:43.313 18:24:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:43.570 BaseBdev1_malloc 00:20:43.570 18:24:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:43.829 true 00:20:43.829 18:24:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:44.087 [2024-07-12 18:24:27.698056] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:44.087 [2024-07-12 18:24:27.698101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:20:44.087 [2024-07-12 18:24:27.698122] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25be0d0 00:20:44.087 [2024-07-12 18:24:27.698134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:44.087 [2024-07-12 18:24:27.699971] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:44.087 [2024-07-12 18:24:27.700001] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:44.087 BaseBdev1 00:20:44.087 18:24:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:44.087 18:24:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:44.346 BaseBdev2_malloc 00:20:44.346 18:24:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:44.604 true 00:20:44.604 18:24:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:44.861 [2024-07-12 18:24:28.437796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:44.861 [2024-07-12 18:24:28.437842] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:44.861 [2024-07-12 18:24:28.437863] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c2910 00:20:44.861 [2024-07-12 18:24:28.437875] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:44.861 [2024-07-12 18:24:28.439474] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:44.861 [2024-07-12 18:24:28.439502] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:44.862 BaseBdev2 00:20:44.862 18:24:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:44.862 18:24:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:45.120 BaseBdev3_malloc 00:20:45.120 18:24:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:45.378 true 00:20:45.378 18:24:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:45.636 [2024-07-12 18:24:29.161510] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:45.636 [2024-07-12 18:24:29.161555] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:45.636 [2024-07-12 18:24:29.161574] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c4bd0 00:20:45.636 [2024-07-12 18:24:29.161587] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:45.636 [2024-07-12 18:24:29.163112] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:45.636 [2024-07-12 18:24:29.163139] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:45.636 BaseBdev3 00:20:45.636 18:24:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:45.636 18:24:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:45.893 BaseBdev4_malloc 00:20:45.893 18:24:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:46.151 true 00:20:46.151 18:24:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:46.410 [2024-07-12 18:24:29.899993] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:46.410 [2024-07-12 18:24:29.900037] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:46.410 [2024-07-12 18:24:29.900058] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25c5aa0 00:20:46.410 [2024-07-12 18:24:29.900070] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:46.410 [2024-07-12 18:24:29.901651] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:46.410 [2024-07-12 18:24:29.901677] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:46.410 BaseBdev4 00:20:46.410 18:24:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:46.668 [2024-07-12 18:24:30.144675] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:46.668 [2024-07-12 18:24:30.146079] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:46.668 [2024-07-12 18:24:30.146146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:46.668 [2024-07-12 18:24:30.146208] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:46.668 [2024-07-12 18:24:30.146435] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25bfc20 00:20:46.668 [2024-07-12 18:24:30.146447] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:46.668 [2024-07-12 18:24:30.146647] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2414260 00:20:46.668 [2024-07-12 18:24:30.146797] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25bfc20 00:20:46.668 [2024-07-12 18:24:30.146807] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25bfc20 00:20:46.668 [2024-07-12 18:24:30.146916] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:46.668 18:24:30 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.668 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.926 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:46.926 "name": "raid_bdev1", 00:20:46.926 "uuid": "0e40ca79-85ac-4b13-93e7-a1c05eab3555", 00:20:46.926 "strip_size_kb": 64, 00:20:46.926 "state": "online", 00:20:46.926 "raid_level": "concat", 00:20:46.926 "superblock": true, 00:20:46.926 "num_base_bdevs": 4, 00:20:46.926 "num_base_bdevs_discovered": 4, 00:20:46.926 "num_base_bdevs_operational": 4, 00:20:46.926 "base_bdevs_list": [ 00:20:46.926 { 00:20:46.926 "name": "BaseBdev1", 00:20:46.926 "uuid": "047c6a55-623f-58d9-915d-4daa94fd02bb", 00:20:46.926 "is_configured": true, 00:20:46.926 "data_offset": 2048, 00:20:46.926 "data_size": 63488 00:20:46.926 }, 00:20:46.926 { 00:20:46.926 "name": "BaseBdev2", 00:20:46.926 "uuid": "9ade353f-48be-50fb-bade-2e5628bdc434", 00:20:46.926 "is_configured": true, 00:20:46.926 "data_offset": 2048, 00:20:46.926 "data_size": 63488 00:20:46.926 }, 00:20:46.926 { 00:20:46.926 "name": "BaseBdev3", 00:20:46.926 "uuid": "71e596cb-bcb0-5a86-90bb-1042e90f180b", 00:20:46.926 "is_configured": true, 00:20:46.926 "data_offset": 2048, 00:20:46.926 "data_size": 63488 00:20:46.926 }, 00:20:46.926 { 00:20:46.926 "name": "BaseBdev4", 00:20:46.926 "uuid": "f561d7cd-1ec1-562d-8382-9a0b7676eb3e", 00:20:46.926 "is_configured": true, 00:20:46.926 "data_offset": 2048, 00:20:46.926 "data_size": 63488 00:20:46.926 } 00:20:46.926 ] 00:20:46.926 }' 00:20:46.926 18:24:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:46.926 18:24:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:47.492 18:24:31 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:20:47.492 18:24:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:47.492 [2024-07-12 18:24:31.119515] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b1fc0 00:20:48.425 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:48.683 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:48.683 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:48.683 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:48.683 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:48.683 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:48.683 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:48.684 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:48.684 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:48.684 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:48.684 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.684 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.684 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.684 18:24:32 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.684 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.684 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:48.941 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.941 "name": "raid_bdev1", 00:20:48.941 "uuid": "0e40ca79-85ac-4b13-93e7-a1c05eab3555", 00:20:48.941 "strip_size_kb": 64, 00:20:48.941 "state": "online", 00:20:48.941 "raid_level": "concat", 00:20:48.941 "superblock": true, 00:20:48.941 "num_base_bdevs": 4, 00:20:48.941 "num_base_bdevs_discovered": 4, 00:20:48.941 "num_base_bdevs_operational": 4, 00:20:48.941 "base_bdevs_list": [ 00:20:48.941 { 00:20:48.941 "name": "BaseBdev1", 00:20:48.941 "uuid": "047c6a55-623f-58d9-915d-4daa94fd02bb", 00:20:48.942 "is_configured": true, 00:20:48.942 "data_offset": 2048, 00:20:48.942 "data_size": 63488 00:20:48.942 }, 00:20:48.942 { 00:20:48.942 "name": "BaseBdev2", 00:20:48.942 "uuid": "9ade353f-48be-50fb-bade-2e5628bdc434", 00:20:48.942 "is_configured": true, 00:20:48.942 "data_offset": 2048, 00:20:48.942 "data_size": 63488 00:20:48.942 }, 00:20:48.942 { 00:20:48.942 "name": "BaseBdev3", 00:20:48.942 "uuid": "71e596cb-bcb0-5a86-90bb-1042e90f180b", 00:20:48.942 "is_configured": true, 00:20:48.942 "data_offset": 2048, 00:20:48.942 "data_size": 63488 00:20:48.942 }, 00:20:48.942 { 00:20:48.942 "name": "BaseBdev4", 00:20:48.942 "uuid": "f561d7cd-1ec1-562d-8382-9a0b7676eb3e", 00:20:48.942 "is_configured": true, 00:20:48.942 "data_offset": 2048, 00:20:48.942 "data_size": 63488 00:20:48.942 } 00:20:48.942 ] 00:20:48.942 }' 00:20:48.942 18:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.942 18:24:32 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:20:49.507 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:50.073 [2024-07-12 18:24:33.620318] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:50.073 [2024-07-12 18:24:33.620355] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:50.073 [2024-07-12 18:24:33.623515] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:50.073 [2024-07-12 18:24:33.623552] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:50.073 [2024-07-12 18:24:33.623592] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:50.073 [2024-07-12 18:24:33.623602] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25bfc20 name raid_bdev1, state offline 00:20:50.073 0 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2547638 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2547638 ']' 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2547638 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2547638 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2547638' 00:20:50.073 killing process with pid 2547638 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2547638 00:20:50.073 [2024-07-12 18:24:33.699841] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:50.073 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2547638 00:20:50.073 [2024-07-12 18:24:33.730753] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Gw2CZZFAjo 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:20:50.332 00:20:50.332 real 0m7.986s 00:20:50.332 user 0m12.870s 00:20:50.332 sys 0m1.398s 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:50.332 18:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.332 ************************************ 00:20:50.332 END TEST raid_write_error_test 00:20:50.332 ************************************ 00:20:50.332 18:24:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:50.332 18:24:33 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:20:50.332 
18:24:33 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:20:50.332 18:24:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:50.332 18:24:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:50.332 18:24:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:50.332 ************************************ 00:20:50.332 START TEST raid_state_function_test 00:20:50.332 ************************************ 00:20:50.332 18:24:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:20:50.332 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:50.332 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:50.332 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:50.332 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:50.332 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2548788 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2548788' 00:20:50.333 Process raid pid: 2548788 00:20:50.333 18:24:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2548788 /var/tmp/spdk-raid.sock 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2548788 ']' 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:50.333 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:50.333 18:24:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.591 [2024-07-12 18:24:34.102893] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:20:50.591 [2024-07-12 18:24:34.102977] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:50.591 [2024-07-12 18:24:34.234650] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:50.849 [2024-07-12 18:24:34.341969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:50.849 [2024-07-12 18:24:34.409877] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:50.849 [2024-07-12 18:24:34.409912] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:51.415 18:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:51.415 18:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:51.415 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:51.981 [2024-07-12 18:24:35.505181] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:51.981 [2024-07-12 18:24:35.505224] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:51.981 [2024-07-12 18:24:35.505235] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:51.981 [2024-07-12 18:24:35.505247] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:51.981 [2024-07-12 18:24:35.505256] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:51.981 [2024-07-12 18:24:35.505267] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:51.981 
[2024-07-12 18:24:35.505276] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:51.981 [2024-07-12 18:24:35.505287] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.981 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:52.240 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:52.240 "name": "Existed_Raid", 00:20:52.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.240 "strip_size_kb": 0, 00:20:52.240 "state": 
"configuring", 00:20:52.240 "raid_level": "raid1", 00:20:52.240 "superblock": false, 00:20:52.240 "num_base_bdevs": 4, 00:20:52.240 "num_base_bdevs_discovered": 0, 00:20:52.240 "num_base_bdevs_operational": 4, 00:20:52.240 "base_bdevs_list": [ 00:20:52.240 { 00:20:52.240 "name": "BaseBdev1", 00:20:52.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.240 "is_configured": false, 00:20:52.240 "data_offset": 0, 00:20:52.240 "data_size": 0 00:20:52.240 }, 00:20:52.240 { 00:20:52.240 "name": "BaseBdev2", 00:20:52.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.240 "is_configured": false, 00:20:52.240 "data_offset": 0, 00:20:52.240 "data_size": 0 00:20:52.240 }, 00:20:52.240 { 00:20:52.240 "name": "BaseBdev3", 00:20:52.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.240 "is_configured": false, 00:20:52.240 "data_offset": 0, 00:20:52.240 "data_size": 0 00:20:52.240 }, 00:20:52.240 { 00:20:52.240 "name": "BaseBdev4", 00:20:52.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.240 "is_configured": false, 00:20:52.240 "data_offset": 0, 00:20:52.240 "data_size": 0 00:20:52.240 } 00:20:52.240 ] 00:20:52.240 }' 00:20:52.240 18:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:52.240 18:24:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:52.806 18:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:53.064 [2024-07-12 18:24:36.539782] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:53.064 [2024-07-12 18:24:36.539811] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f84aa0 name Existed_Raid, state configuring 00:20:53.064 18:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:53.064 [2024-07-12 18:24:36.784446] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:53.064 [2024-07-12 18:24:36.784475] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:53.064 [2024-07-12 18:24:36.784484] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:53.064 [2024-07-12 18:24:36.784496] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:53.064 [2024-07-12 18:24:36.784504] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:53.064 [2024-07-12 18:24:36.784516] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:53.064 [2024-07-12 18:24:36.784524] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:53.064 [2024-07-12 18:24:36.784535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:53.322 18:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:53.322 [2024-07-12 18:24:36.970890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:53.322 BaseBdev1 00:20:53.322 18:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:53.322 18:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:53.322 18:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:53.322 18:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:53.322 18:24:36 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:53.322 18:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:53.322 18:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:53.581 18:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:53.842 [ 00:20:53.842 { 00:20:53.842 "name": "BaseBdev1", 00:20:53.842 "aliases": [ 00:20:53.842 "59a4c6c5-485e-4c00-b5b8-1e9306ae817f" 00:20:53.842 ], 00:20:53.842 "product_name": "Malloc disk", 00:20:53.842 "block_size": 512, 00:20:53.842 "num_blocks": 65536, 00:20:53.842 "uuid": "59a4c6c5-485e-4c00-b5b8-1e9306ae817f", 00:20:53.842 "assigned_rate_limits": { 00:20:53.842 "rw_ios_per_sec": 0, 00:20:53.842 "rw_mbytes_per_sec": 0, 00:20:53.842 "r_mbytes_per_sec": 0, 00:20:53.842 "w_mbytes_per_sec": 0 00:20:53.842 }, 00:20:53.842 "claimed": true, 00:20:53.842 "claim_type": "exclusive_write", 00:20:53.842 "zoned": false, 00:20:53.842 "supported_io_types": { 00:20:53.842 "read": true, 00:20:53.842 "write": true, 00:20:53.842 "unmap": true, 00:20:53.842 "flush": true, 00:20:53.842 "reset": true, 00:20:53.842 "nvme_admin": false, 00:20:53.842 "nvme_io": false, 00:20:53.842 "nvme_io_md": false, 00:20:53.842 "write_zeroes": true, 00:20:53.842 "zcopy": true, 00:20:53.842 "get_zone_info": false, 00:20:53.842 "zone_management": false, 00:20:53.842 "zone_append": false, 00:20:53.842 "compare": false, 00:20:53.842 "compare_and_write": false, 00:20:53.842 "abort": true, 00:20:53.842 "seek_hole": false, 00:20:53.842 "seek_data": false, 00:20:53.842 "copy": true, 00:20:53.842 "nvme_iov_md": false 00:20:53.842 }, 00:20:53.842 "memory_domains": [ 00:20:53.842 { 
00:20:53.842 "dma_device_id": "system", 00:20:53.842 "dma_device_type": 1 00:20:53.842 }, 00:20:53.842 { 00:20:53.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.842 "dma_device_type": 2 00:20:53.842 } 00:20:53.842 ], 00:20:53.842 "driver_specific": {} 00:20:53.842 } 00:20:53.842 ] 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:20:53.842 "name": "Existed_Raid", 00:20:53.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.842 "strip_size_kb": 0, 00:20:53.842 "state": "configuring", 00:20:53.842 "raid_level": "raid1", 00:20:53.842 "superblock": false, 00:20:53.842 "num_base_bdevs": 4, 00:20:53.842 "num_base_bdevs_discovered": 1, 00:20:53.842 "num_base_bdevs_operational": 4, 00:20:53.842 "base_bdevs_list": [ 00:20:53.842 { 00:20:53.842 "name": "BaseBdev1", 00:20:53.842 "uuid": "59a4c6c5-485e-4c00-b5b8-1e9306ae817f", 00:20:53.842 "is_configured": true, 00:20:53.842 "data_offset": 0, 00:20:53.842 "data_size": 65536 00:20:53.842 }, 00:20:53.842 { 00:20:53.842 "name": "BaseBdev2", 00:20:53.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.842 "is_configured": false, 00:20:53.842 "data_offset": 0, 00:20:53.842 "data_size": 0 00:20:53.842 }, 00:20:53.842 { 00:20:53.842 "name": "BaseBdev3", 00:20:53.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.842 "is_configured": false, 00:20:53.842 "data_offset": 0, 00:20:53.842 "data_size": 0 00:20:53.842 }, 00:20:53.842 { 00:20:53.842 "name": "BaseBdev4", 00:20:53.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.842 "is_configured": false, 00:20:53.842 "data_offset": 0, 00:20:53.842 "data_size": 0 00:20:53.842 } 00:20:53.842 ] 00:20:53.842 }' 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.842 18:24:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:54.476 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:54.733 [2024-07-12 18:24:38.342529] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:54.733 [2024-07-12 18:24:38.342570] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f84310 name Existed_Raid, state configuring 
00:20:54.734 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:54.991 [2024-07-12 18:24:38.583193] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:54.991 [2024-07-12 18:24:38.584638] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:54.991 [2024-07-12 18:24:38.584669] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:54.991 [2024-07-12 18:24:38.584679] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:54.991 [2024-07-12 18:24:38.584691] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:54.991 [2024-07-12 18:24:38.584700] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:54.991 [2024-07-12 18:24:38.584711] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.991 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:55.249 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.249 "name": "Existed_Raid", 00:20:55.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.249 "strip_size_kb": 0, 00:20:55.249 "state": "configuring", 00:20:55.249 "raid_level": "raid1", 00:20:55.249 "superblock": false, 00:20:55.249 "num_base_bdevs": 4, 00:20:55.249 "num_base_bdevs_discovered": 1, 00:20:55.249 "num_base_bdevs_operational": 4, 00:20:55.249 "base_bdevs_list": [ 00:20:55.249 { 00:20:55.249 "name": "BaseBdev1", 00:20:55.249 "uuid": "59a4c6c5-485e-4c00-b5b8-1e9306ae817f", 00:20:55.249 "is_configured": true, 00:20:55.249 "data_offset": 0, 00:20:55.249 "data_size": 65536 00:20:55.249 }, 00:20:55.249 { 00:20:55.249 "name": "BaseBdev2", 00:20:55.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.249 "is_configured": false, 00:20:55.249 "data_offset": 0, 00:20:55.249 "data_size": 0 00:20:55.249 }, 00:20:55.249 { 00:20:55.249 "name": "BaseBdev3", 00:20:55.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.249 "is_configured": false, 00:20:55.249 
"data_offset": 0, 00:20:55.249 "data_size": 0 00:20:55.249 }, 00:20:55.249 { 00:20:55.249 "name": "BaseBdev4", 00:20:55.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.249 "is_configured": false, 00:20:55.249 "data_offset": 0, 00:20:55.249 "data_size": 0 00:20:55.249 } 00:20:55.249 ] 00:20:55.249 }' 00:20:55.249 18:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.249 18:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:55.835 18:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:56.093 [2024-07-12 18:24:39.670674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:56.093 BaseBdev2 00:20:56.093 18:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:56.093 18:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:56.093 18:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:56.093 18:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:56.093 18:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:56.093 18:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:56.093 18:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:56.351 18:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:56.609 [ 
00:20:56.609 { 00:20:56.609 "name": "BaseBdev2", 00:20:56.609 "aliases": [ 00:20:56.609 "e416d868-8a47-4a8d-9cfd-57f50e29c0ca" 00:20:56.609 ], 00:20:56.609 "product_name": "Malloc disk", 00:20:56.609 "block_size": 512, 00:20:56.609 "num_blocks": 65536, 00:20:56.609 "uuid": "e416d868-8a47-4a8d-9cfd-57f50e29c0ca", 00:20:56.609 "assigned_rate_limits": { 00:20:56.609 "rw_ios_per_sec": 0, 00:20:56.609 "rw_mbytes_per_sec": 0, 00:20:56.609 "r_mbytes_per_sec": 0, 00:20:56.609 "w_mbytes_per_sec": 0 00:20:56.609 }, 00:20:56.609 "claimed": true, 00:20:56.609 "claim_type": "exclusive_write", 00:20:56.609 "zoned": false, 00:20:56.609 "supported_io_types": { 00:20:56.609 "read": true, 00:20:56.609 "write": true, 00:20:56.609 "unmap": true, 00:20:56.609 "flush": true, 00:20:56.609 "reset": true, 00:20:56.609 "nvme_admin": false, 00:20:56.609 "nvme_io": false, 00:20:56.609 "nvme_io_md": false, 00:20:56.609 "write_zeroes": true, 00:20:56.609 "zcopy": true, 00:20:56.609 "get_zone_info": false, 00:20:56.609 "zone_management": false, 00:20:56.609 "zone_append": false, 00:20:56.609 "compare": false, 00:20:56.609 "compare_and_write": false, 00:20:56.609 "abort": true, 00:20:56.609 "seek_hole": false, 00:20:56.609 "seek_data": false, 00:20:56.609 "copy": true, 00:20:56.609 "nvme_iov_md": false 00:20:56.609 }, 00:20:56.609 "memory_domains": [ 00:20:56.609 { 00:20:56.609 "dma_device_id": "system", 00:20:56.609 "dma_device_type": 1 00:20:56.609 }, 00:20:56.609 { 00:20:56.609 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.609 "dma_device_type": 2 00:20:56.609 } 00:20:56.609 ], 00:20:56.609 "driver_specific": {} 00:20:56.609 } 00:20:56.609 ] 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:56.609 18:24:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.609 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:56.868 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:56.868 "name": "Existed_Raid", 00:20:56.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.868 "strip_size_kb": 0, 00:20:56.868 "state": "configuring", 00:20:56.868 "raid_level": "raid1", 00:20:56.868 "superblock": false, 00:20:56.868 "num_base_bdevs": 4, 00:20:56.868 "num_base_bdevs_discovered": 2, 00:20:56.868 "num_base_bdevs_operational": 4, 00:20:56.868 "base_bdevs_list": [ 00:20:56.868 { 00:20:56.868 
"name": "BaseBdev1", 00:20:56.868 "uuid": "59a4c6c5-485e-4c00-b5b8-1e9306ae817f", 00:20:56.868 "is_configured": true, 00:20:56.868 "data_offset": 0, 00:20:56.868 "data_size": 65536 00:20:56.868 }, 00:20:56.868 { 00:20:56.868 "name": "BaseBdev2", 00:20:56.868 "uuid": "e416d868-8a47-4a8d-9cfd-57f50e29c0ca", 00:20:56.868 "is_configured": true, 00:20:56.868 "data_offset": 0, 00:20:56.868 "data_size": 65536 00:20:56.868 }, 00:20:56.868 { 00:20:56.868 "name": "BaseBdev3", 00:20:56.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.868 "is_configured": false, 00:20:56.868 "data_offset": 0, 00:20:56.868 "data_size": 0 00:20:56.868 }, 00:20:56.868 { 00:20:56.868 "name": "BaseBdev4", 00:20:56.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.868 "is_configured": false, 00:20:56.868 "data_offset": 0, 00:20:56.868 "data_size": 0 00:20:56.868 } 00:20:56.868 ] 00:20:56.868 }' 00:20:56.868 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:56.868 18:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:57.433 18:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:57.998 [2024-07-12 18:24:41.476025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:57.998 BaseBdev3 00:20:57.998 18:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:57.998 18:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:57.998 18:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:57.998 18:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:57.998 18:24:41 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:57.998 18:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:57.998 18:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:58.254 18:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:58.512 [ 00:20:58.512 { 00:20:58.512 "name": "BaseBdev3", 00:20:58.512 "aliases": [ 00:20:58.512 "1ba9d5b0-a3bd-48df-9dd5-98886547c716" 00:20:58.512 ], 00:20:58.512 "product_name": "Malloc disk", 00:20:58.512 "block_size": 512, 00:20:58.512 "num_blocks": 65536, 00:20:58.512 "uuid": "1ba9d5b0-a3bd-48df-9dd5-98886547c716", 00:20:58.512 "assigned_rate_limits": { 00:20:58.512 "rw_ios_per_sec": 0, 00:20:58.512 "rw_mbytes_per_sec": 0, 00:20:58.512 "r_mbytes_per_sec": 0, 00:20:58.512 "w_mbytes_per_sec": 0 00:20:58.512 }, 00:20:58.512 "claimed": true, 00:20:58.512 "claim_type": "exclusive_write", 00:20:58.512 "zoned": false, 00:20:58.512 "supported_io_types": { 00:20:58.512 "read": true, 00:20:58.512 "write": true, 00:20:58.512 "unmap": true, 00:20:58.512 "flush": true, 00:20:58.512 "reset": true, 00:20:58.512 "nvme_admin": false, 00:20:58.512 "nvme_io": false, 00:20:58.512 "nvme_io_md": false, 00:20:58.512 "write_zeroes": true, 00:20:58.512 "zcopy": true, 00:20:58.512 "get_zone_info": false, 00:20:58.512 "zone_management": false, 00:20:58.512 "zone_append": false, 00:20:58.512 "compare": false, 00:20:58.512 "compare_and_write": false, 00:20:58.512 "abort": true, 00:20:58.512 "seek_hole": false, 00:20:58.512 "seek_data": false, 00:20:58.512 "copy": true, 00:20:58.512 "nvme_iov_md": false 00:20:58.512 }, 00:20:58.512 "memory_domains": [ 00:20:58.512 { 00:20:58.512 "dma_device_id": "system", 
00:20:58.512 "dma_device_type": 1 00:20:58.512 }, 00:20:58.512 { 00:20:58.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.512 "dma_device_type": 2 00:20:58.512 } 00:20:58.512 ], 00:20:58.512 "driver_specific": {} 00:20:58.512 } 00:20:58.512 ] 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.770 18:24:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:59.027 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.027 "name": "Existed_Raid", 00:20:59.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.027 "strip_size_kb": 0, 00:20:59.027 "state": "configuring", 00:20:59.027 "raid_level": "raid1", 00:20:59.027 "superblock": false, 00:20:59.027 "num_base_bdevs": 4, 00:20:59.027 "num_base_bdevs_discovered": 3, 00:20:59.027 "num_base_bdevs_operational": 4, 00:20:59.027 "base_bdevs_list": [ 00:20:59.027 { 00:20:59.027 "name": "BaseBdev1", 00:20:59.027 "uuid": "59a4c6c5-485e-4c00-b5b8-1e9306ae817f", 00:20:59.027 "is_configured": true, 00:20:59.027 "data_offset": 0, 00:20:59.027 "data_size": 65536 00:20:59.027 }, 00:20:59.027 { 00:20:59.027 "name": "BaseBdev2", 00:20:59.027 "uuid": "e416d868-8a47-4a8d-9cfd-57f50e29c0ca", 00:20:59.027 "is_configured": true, 00:20:59.027 "data_offset": 0, 00:20:59.027 "data_size": 65536 00:20:59.027 }, 00:20:59.027 { 00:20:59.027 "name": "BaseBdev3", 00:20:59.027 "uuid": "1ba9d5b0-a3bd-48df-9dd5-98886547c716", 00:20:59.027 "is_configured": true, 00:20:59.027 "data_offset": 0, 00:20:59.027 "data_size": 65536 00:20:59.027 }, 00:20:59.027 { 00:20:59.027 "name": "BaseBdev4", 00:20:59.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.027 "is_configured": false, 00:20:59.027 "data_offset": 0, 00:20:59.027 "data_size": 0 00:20:59.027 } 00:20:59.027 ] 00:20:59.027 }' 00:20:59.027 18:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.027 18:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:59.594 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:59.594 [2024-07-12 18:24:43.211976] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:59.594 [2024-07-12 18:24:43.212015] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f85350 00:20:59.594 [2024-07-12 18:24:43.212023] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:59.594 [2024-07-12 18:24:43.212290] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f85020 00:20:59.594 [2024-07-12 18:24:43.212424] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f85350 00:20:59.594 [2024-07-12 18:24:43.212435] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f85350 00:20:59.594 [2024-07-12 18:24:43.212604] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:59.594 BaseBdev4 00:20:59.594 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:59.594 18:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:59.594 18:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:59.594 18:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:59.594 18:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:59.594 18:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:59.594 18:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:59.851 18:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:00.108 [ 00:21:00.108 { 
00:21:00.108 "name": "BaseBdev4", 00:21:00.108 "aliases": [ 00:21:00.108 "c2b99572-52bb-44e1-aa30-cae556f3f7b7" 00:21:00.108 ], 00:21:00.108 "product_name": "Malloc disk", 00:21:00.108 "block_size": 512, 00:21:00.108 "num_blocks": 65536, 00:21:00.108 "uuid": "c2b99572-52bb-44e1-aa30-cae556f3f7b7", 00:21:00.108 "assigned_rate_limits": { 00:21:00.108 "rw_ios_per_sec": 0, 00:21:00.108 "rw_mbytes_per_sec": 0, 00:21:00.108 "r_mbytes_per_sec": 0, 00:21:00.108 "w_mbytes_per_sec": 0 00:21:00.108 }, 00:21:00.108 "claimed": true, 00:21:00.108 "claim_type": "exclusive_write", 00:21:00.108 "zoned": false, 00:21:00.108 "supported_io_types": { 00:21:00.108 "read": true, 00:21:00.108 "write": true, 00:21:00.108 "unmap": true, 00:21:00.108 "flush": true, 00:21:00.108 "reset": true, 00:21:00.108 "nvme_admin": false, 00:21:00.108 "nvme_io": false, 00:21:00.108 "nvme_io_md": false, 00:21:00.108 "write_zeroes": true, 00:21:00.108 "zcopy": true, 00:21:00.108 "get_zone_info": false, 00:21:00.108 "zone_management": false, 00:21:00.108 "zone_append": false, 00:21:00.108 "compare": false, 00:21:00.108 "compare_and_write": false, 00:21:00.108 "abort": true, 00:21:00.108 "seek_hole": false, 00:21:00.108 "seek_data": false, 00:21:00.108 "copy": true, 00:21:00.108 "nvme_iov_md": false 00:21:00.108 }, 00:21:00.108 "memory_domains": [ 00:21:00.108 { 00:21:00.109 "dma_device_id": "system", 00:21:00.109 "dma_device_type": 1 00:21:00.109 }, 00:21:00.109 { 00:21:00.109 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.109 "dma_device_type": 2 00:21:00.109 } 00:21:00.109 ], 00:21:00.109 "driver_specific": {} 00:21:00.109 } 00:21:00.109 ] 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:00.109 18:24:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.109 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:00.366 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.366 "name": "Existed_Raid", 00:21:00.366 "uuid": "2cc98672-fc81-4c62-a86f-5ee965ab0e29", 00:21:00.366 "strip_size_kb": 0, 00:21:00.366 "state": "online", 00:21:00.366 "raid_level": "raid1", 00:21:00.366 "superblock": false, 00:21:00.366 "num_base_bdevs": 4, 00:21:00.366 "num_base_bdevs_discovered": 4, 00:21:00.366 "num_base_bdevs_operational": 4, 00:21:00.366 "base_bdevs_list": [ 00:21:00.366 { 00:21:00.366 "name": 
"BaseBdev1", 00:21:00.366 "uuid": "59a4c6c5-485e-4c00-b5b8-1e9306ae817f", 00:21:00.366 "is_configured": true, 00:21:00.366 "data_offset": 0, 00:21:00.366 "data_size": 65536 00:21:00.366 }, 00:21:00.366 { 00:21:00.366 "name": "BaseBdev2", 00:21:00.366 "uuid": "e416d868-8a47-4a8d-9cfd-57f50e29c0ca", 00:21:00.366 "is_configured": true, 00:21:00.366 "data_offset": 0, 00:21:00.366 "data_size": 65536 00:21:00.366 }, 00:21:00.366 { 00:21:00.366 "name": "BaseBdev3", 00:21:00.366 "uuid": "1ba9d5b0-a3bd-48df-9dd5-98886547c716", 00:21:00.366 "is_configured": true, 00:21:00.366 "data_offset": 0, 00:21:00.366 "data_size": 65536 00:21:00.366 }, 00:21:00.366 { 00:21:00.366 "name": "BaseBdev4", 00:21:00.366 "uuid": "c2b99572-52bb-44e1-aa30-cae556f3f7b7", 00:21:00.366 "is_configured": true, 00:21:00.366 "data_offset": 0, 00:21:00.366 "data_size": 65536 00:21:00.366 } 00:21:00.366 ] 00:21:00.366 }' 00:21:00.366 18:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.366 18:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:00.931 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:00.931 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:00.931 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:00.931 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:00.931 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:00.931 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:00.931 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:00.931 18:24:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:01.189 [2024-07-12 18:24:44.776448] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:01.189 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:01.189 "name": "Existed_Raid", 00:21:01.189 "aliases": [ 00:21:01.189 "2cc98672-fc81-4c62-a86f-5ee965ab0e29" 00:21:01.189 ], 00:21:01.189 "product_name": "Raid Volume", 00:21:01.189 "block_size": 512, 00:21:01.189 "num_blocks": 65536, 00:21:01.189 "uuid": "2cc98672-fc81-4c62-a86f-5ee965ab0e29", 00:21:01.189 "assigned_rate_limits": { 00:21:01.189 "rw_ios_per_sec": 0, 00:21:01.189 "rw_mbytes_per_sec": 0, 00:21:01.189 "r_mbytes_per_sec": 0, 00:21:01.189 "w_mbytes_per_sec": 0 00:21:01.189 }, 00:21:01.189 "claimed": false, 00:21:01.189 "zoned": false, 00:21:01.189 "supported_io_types": { 00:21:01.189 "read": true, 00:21:01.189 "write": true, 00:21:01.189 "unmap": false, 00:21:01.189 "flush": false, 00:21:01.189 "reset": true, 00:21:01.189 "nvme_admin": false, 00:21:01.189 "nvme_io": false, 00:21:01.189 "nvme_io_md": false, 00:21:01.189 "write_zeroes": true, 00:21:01.189 "zcopy": false, 00:21:01.189 "get_zone_info": false, 00:21:01.189 "zone_management": false, 00:21:01.189 "zone_append": false, 00:21:01.189 "compare": false, 00:21:01.189 "compare_and_write": false, 00:21:01.189 "abort": false, 00:21:01.189 "seek_hole": false, 00:21:01.189 "seek_data": false, 00:21:01.189 "copy": false, 00:21:01.189 "nvme_iov_md": false 00:21:01.189 }, 00:21:01.189 "memory_domains": [ 00:21:01.189 { 00:21:01.189 "dma_device_id": "system", 00:21:01.189 "dma_device_type": 1 00:21:01.189 }, 00:21:01.189 { 00:21:01.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.189 "dma_device_type": 2 00:21:01.189 }, 00:21:01.189 { 00:21:01.189 "dma_device_id": "system", 00:21:01.189 "dma_device_type": 1 00:21:01.189 }, 00:21:01.189 { 00:21:01.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:21:01.189 "dma_device_type": 2 00:21:01.189 }, 00:21:01.189 { 00:21:01.189 "dma_device_id": "system", 00:21:01.189 "dma_device_type": 1 00:21:01.189 }, 00:21:01.189 { 00:21:01.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.190 "dma_device_type": 2 00:21:01.190 }, 00:21:01.190 { 00:21:01.190 "dma_device_id": "system", 00:21:01.190 "dma_device_type": 1 00:21:01.190 }, 00:21:01.190 { 00:21:01.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.190 "dma_device_type": 2 00:21:01.190 } 00:21:01.190 ], 00:21:01.190 "driver_specific": { 00:21:01.190 "raid": { 00:21:01.190 "uuid": "2cc98672-fc81-4c62-a86f-5ee965ab0e29", 00:21:01.190 "strip_size_kb": 0, 00:21:01.190 "state": "online", 00:21:01.190 "raid_level": "raid1", 00:21:01.190 "superblock": false, 00:21:01.190 "num_base_bdevs": 4, 00:21:01.190 "num_base_bdevs_discovered": 4, 00:21:01.190 "num_base_bdevs_operational": 4, 00:21:01.190 "base_bdevs_list": [ 00:21:01.190 { 00:21:01.190 "name": "BaseBdev1", 00:21:01.190 "uuid": "59a4c6c5-485e-4c00-b5b8-1e9306ae817f", 00:21:01.190 "is_configured": true, 00:21:01.190 "data_offset": 0, 00:21:01.190 "data_size": 65536 00:21:01.190 }, 00:21:01.190 { 00:21:01.190 "name": "BaseBdev2", 00:21:01.190 "uuid": "e416d868-8a47-4a8d-9cfd-57f50e29c0ca", 00:21:01.190 "is_configured": true, 00:21:01.190 "data_offset": 0, 00:21:01.190 "data_size": 65536 00:21:01.190 }, 00:21:01.190 { 00:21:01.190 "name": "BaseBdev3", 00:21:01.190 "uuid": "1ba9d5b0-a3bd-48df-9dd5-98886547c716", 00:21:01.190 "is_configured": true, 00:21:01.190 "data_offset": 0, 00:21:01.190 "data_size": 65536 00:21:01.190 }, 00:21:01.190 { 00:21:01.190 "name": "BaseBdev4", 00:21:01.190 "uuid": "c2b99572-52bb-44e1-aa30-cae556f3f7b7", 00:21:01.190 "is_configured": true, 00:21:01.190 "data_offset": 0, 00:21:01.190 "data_size": 65536 00:21:01.190 } 00:21:01.190 ] 00:21:01.190 } 00:21:01.190 } 00:21:01.190 }' 00:21:01.190 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:01.190 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:01.190 BaseBdev2 00:21:01.190 BaseBdev3 00:21:01.190 BaseBdev4' 00:21:01.190 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.190 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:01.190 18:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.447 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.447 "name": "BaseBdev1", 00:21:01.447 "aliases": [ 00:21:01.447 "59a4c6c5-485e-4c00-b5b8-1e9306ae817f" 00:21:01.447 ], 00:21:01.447 "product_name": "Malloc disk", 00:21:01.447 "block_size": 512, 00:21:01.447 "num_blocks": 65536, 00:21:01.447 "uuid": "59a4c6c5-485e-4c00-b5b8-1e9306ae817f", 00:21:01.447 "assigned_rate_limits": { 00:21:01.447 "rw_ios_per_sec": 0, 00:21:01.447 "rw_mbytes_per_sec": 0, 00:21:01.447 "r_mbytes_per_sec": 0, 00:21:01.447 "w_mbytes_per_sec": 0 00:21:01.447 }, 00:21:01.447 "claimed": true, 00:21:01.447 "claim_type": "exclusive_write", 00:21:01.447 "zoned": false, 00:21:01.447 "supported_io_types": { 00:21:01.447 "read": true, 00:21:01.447 "write": true, 00:21:01.447 "unmap": true, 00:21:01.447 "flush": true, 00:21:01.447 "reset": true, 00:21:01.447 "nvme_admin": false, 00:21:01.447 "nvme_io": false, 00:21:01.447 "nvme_io_md": false, 00:21:01.447 "write_zeroes": true, 00:21:01.447 "zcopy": true, 00:21:01.447 "get_zone_info": false, 00:21:01.447 "zone_management": false, 00:21:01.447 "zone_append": false, 00:21:01.447 "compare": false, 00:21:01.447 "compare_and_write": false, 00:21:01.447 "abort": true, 00:21:01.447 "seek_hole": false, 00:21:01.447 "seek_data": 
false, 00:21:01.447 "copy": true, 00:21:01.447 "nvme_iov_md": false 00:21:01.447 }, 00:21:01.447 "memory_domains": [ 00:21:01.447 { 00:21:01.447 "dma_device_id": "system", 00:21:01.447 "dma_device_type": 1 00:21:01.447 }, 00:21:01.447 { 00:21:01.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.447 "dma_device_type": 2 00:21:01.447 } 00:21:01.447 ], 00:21:01.447 "driver_specific": {} 00:21:01.447 }' 00:21:01.447 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.447 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.704 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:01.704 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.704 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.704 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:01.704 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.704 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.704 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:01.704 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.704 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.962 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:01.962 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.962 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:01.962 18:24:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:02.220 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:02.220 "name": "BaseBdev2", 00:21:02.220 "aliases": [ 00:21:02.220 "e416d868-8a47-4a8d-9cfd-57f50e29c0ca" 00:21:02.220 ], 00:21:02.220 "product_name": "Malloc disk", 00:21:02.220 "block_size": 512, 00:21:02.220 "num_blocks": 65536, 00:21:02.220 "uuid": "e416d868-8a47-4a8d-9cfd-57f50e29c0ca", 00:21:02.220 "assigned_rate_limits": { 00:21:02.220 "rw_ios_per_sec": 0, 00:21:02.220 "rw_mbytes_per_sec": 0, 00:21:02.220 "r_mbytes_per_sec": 0, 00:21:02.220 "w_mbytes_per_sec": 0 00:21:02.220 }, 00:21:02.220 "claimed": true, 00:21:02.220 "claim_type": "exclusive_write", 00:21:02.220 "zoned": false, 00:21:02.220 "supported_io_types": { 00:21:02.220 "read": true, 00:21:02.220 "write": true, 00:21:02.220 "unmap": true, 00:21:02.220 "flush": true, 00:21:02.220 "reset": true, 00:21:02.220 "nvme_admin": false, 00:21:02.220 "nvme_io": false, 00:21:02.220 "nvme_io_md": false, 00:21:02.220 "write_zeroes": true, 00:21:02.220 "zcopy": true, 00:21:02.220 "get_zone_info": false, 00:21:02.220 "zone_management": false, 00:21:02.220 "zone_append": false, 00:21:02.220 "compare": false, 00:21:02.220 "compare_and_write": false, 00:21:02.220 "abort": true, 00:21:02.220 "seek_hole": false, 00:21:02.220 "seek_data": false, 00:21:02.220 "copy": true, 00:21:02.220 "nvme_iov_md": false 00:21:02.220 }, 00:21:02.220 "memory_domains": [ 00:21:02.220 { 00:21:02.220 "dma_device_id": "system", 00:21:02.220 "dma_device_type": 1 00:21:02.220 }, 00:21:02.220 { 00:21:02.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.220 "dma_device_type": 2 00:21:02.220 } 00:21:02.220 ], 00:21:02.220 "driver_specific": {} 00:21:02.220 }' 00:21:02.220 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.220 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:21:02.220 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:02.220 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.220 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.220 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.220 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.220 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.477 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.477 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.478 18:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.478 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.478 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.478 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:02.478 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:02.735 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:02.735 "name": "BaseBdev3", 00:21:02.735 "aliases": [ 00:21:02.735 "1ba9d5b0-a3bd-48df-9dd5-98886547c716" 00:21:02.735 ], 00:21:02.735 "product_name": "Malloc disk", 00:21:02.735 "block_size": 512, 00:21:02.735 "num_blocks": 65536, 00:21:02.735 "uuid": "1ba9d5b0-a3bd-48df-9dd5-98886547c716", 00:21:02.735 "assigned_rate_limits": { 00:21:02.735 "rw_ios_per_sec": 0, 00:21:02.735 "rw_mbytes_per_sec": 0, 00:21:02.735 "r_mbytes_per_sec": 0, 
00:21:02.735 "w_mbytes_per_sec": 0 00:21:02.735 }, 00:21:02.735 "claimed": true, 00:21:02.735 "claim_type": "exclusive_write", 00:21:02.735 "zoned": false, 00:21:02.735 "supported_io_types": { 00:21:02.735 "read": true, 00:21:02.735 "write": true, 00:21:02.735 "unmap": true, 00:21:02.735 "flush": true, 00:21:02.735 "reset": true, 00:21:02.735 "nvme_admin": false, 00:21:02.735 "nvme_io": false, 00:21:02.735 "nvme_io_md": false, 00:21:02.735 "write_zeroes": true, 00:21:02.735 "zcopy": true, 00:21:02.735 "get_zone_info": false, 00:21:02.735 "zone_management": false, 00:21:02.735 "zone_append": false, 00:21:02.735 "compare": false, 00:21:02.735 "compare_and_write": false, 00:21:02.735 "abort": true, 00:21:02.735 "seek_hole": false, 00:21:02.735 "seek_data": false, 00:21:02.735 "copy": true, 00:21:02.735 "nvme_iov_md": false 00:21:02.735 }, 00:21:02.735 "memory_domains": [ 00:21:02.735 { 00:21:02.735 "dma_device_id": "system", 00:21:02.735 "dma_device_type": 1 00:21:02.735 }, 00:21:02.735 { 00:21:02.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.735 "dma_device_type": 2 00:21:02.735 } 00:21:02.735 ], 00:21:02.735 "driver_specific": {} 00:21:02.735 }' 00:21:02.735 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.735 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.735 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:02.735 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.735 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.735 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.735 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.993 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:21:02.993 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.993 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.993 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.993 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.993 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.993 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:02.993 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:03.250 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:03.250 "name": "BaseBdev4", 00:21:03.250 "aliases": [ 00:21:03.250 "c2b99572-52bb-44e1-aa30-cae556f3f7b7" 00:21:03.250 ], 00:21:03.250 "product_name": "Malloc disk", 00:21:03.250 "block_size": 512, 00:21:03.250 "num_blocks": 65536, 00:21:03.250 "uuid": "c2b99572-52bb-44e1-aa30-cae556f3f7b7", 00:21:03.250 "assigned_rate_limits": { 00:21:03.250 "rw_ios_per_sec": 0, 00:21:03.250 "rw_mbytes_per_sec": 0, 00:21:03.250 "r_mbytes_per_sec": 0, 00:21:03.250 "w_mbytes_per_sec": 0 00:21:03.250 }, 00:21:03.250 "claimed": true, 00:21:03.250 "claim_type": "exclusive_write", 00:21:03.250 "zoned": false, 00:21:03.250 "supported_io_types": { 00:21:03.250 "read": true, 00:21:03.250 "write": true, 00:21:03.250 "unmap": true, 00:21:03.250 "flush": true, 00:21:03.250 "reset": true, 00:21:03.250 "nvme_admin": false, 00:21:03.250 "nvme_io": false, 00:21:03.250 "nvme_io_md": false, 00:21:03.250 "write_zeroes": true, 00:21:03.250 "zcopy": true, 00:21:03.250 "get_zone_info": false, 00:21:03.250 "zone_management": false, 00:21:03.250 "zone_append": false, 00:21:03.250 
"compare": false, 00:21:03.250 "compare_and_write": false, 00:21:03.250 "abort": true, 00:21:03.250 "seek_hole": false, 00:21:03.250 "seek_data": false, 00:21:03.250 "copy": true, 00:21:03.250 "nvme_iov_md": false 00:21:03.250 }, 00:21:03.250 "memory_domains": [ 00:21:03.250 { 00:21:03.250 "dma_device_id": "system", 00:21:03.250 "dma_device_type": 1 00:21:03.250 }, 00:21:03.250 { 00:21:03.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.250 "dma_device_type": 2 00:21:03.250 } 00:21:03.250 ], 00:21:03.250 "driver_specific": {} 00:21:03.250 }' 00:21:03.250 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.250 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.250 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:03.250 18:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.507 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.507 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:03.507 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.507 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.507 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:03.507 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.507 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.507 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:21:03.765 [2024-07-12 18:24:47.459296] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.765 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.765 18:24:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:04.022 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.022 "name": "Existed_Raid", 00:21:04.022 "uuid": "2cc98672-fc81-4c62-a86f-5ee965ab0e29", 00:21:04.022 "strip_size_kb": 0, 00:21:04.022 "state": "online", 00:21:04.022 "raid_level": "raid1", 00:21:04.022 "superblock": false, 00:21:04.022 "num_base_bdevs": 4, 00:21:04.022 "num_base_bdevs_discovered": 3, 00:21:04.022 "num_base_bdevs_operational": 3, 00:21:04.022 "base_bdevs_list": [ 00:21:04.022 { 00:21:04.022 "name": null, 00:21:04.022 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:04.022 "is_configured": false, 00:21:04.022 "data_offset": 0, 00:21:04.022 "data_size": 65536 00:21:04.022 }, 00:21:04.022 { 00:21:04.022 "name": "BaseBdev2", 00:21:04.022 "uuid": "e416d868-8a47-4a8d-9cfd-57f50e29c0ca", 00:21:04.022 "is_configured": true, 00:21:04.022 "data_offset": 0, 00:21:04.022 "data_size": 65536 00:21:04.022 }, 00:21:04.022 { 00:21:04.022 "name": "BaseBdev3", 00:21:04.022 "uuid": "1ba9d5b0-a3bd-48df-9dd5-98886547c716", 00:21:04.022 "is_configured": true, 00:21:04.022 "data_offset": 0, 00:21:04.022 "data_size": 65536 00:21:04.022 }, 00:21:04.022 { 00:21:04.022 "name": "BaseBdev4", 00:21:04.022 "uuid": "c2b99572-52bb-44e1-aa30-cae556f3f7b7", 00:21:04.022 "is_configured": true, 00:21:04.022 "data_offset": 0, 00:21:04.022 "data_size": 65536 00:21:04.022 } 00:21:04.022 ] 00:21:04.022 }' 00:21:04.022 18:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.022 18:24:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:04.955 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:04.955 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:04.955 18:24:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.955 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:04.955 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:04.955 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:04.955 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:05.213 [2024-07-12 18:24:48.792786] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:05.213 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:05.213 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:05.213 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.213 18:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:05.470 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:05.470 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:05.470 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:05.728 [2024-07-12 18:24:49.300422] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:05.728 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:21:05.728 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:05.728 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.728 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:05.986 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:05.986 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:05.986 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:06.244 [2024-07-12 18:24:49.802220] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:06.244 [2024-07-12 18:24:49.802294] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:06.244 [2024-07-12 18:24:49.814903] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:06.244 [2024-07-12 18:24:49.814945] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:06.244 [2024-07-12 18:24:49.814958] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f85350 name Existed_Raid, state offline 00:21:06.244 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:06.244 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:06.244 18:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.244 18:24:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:06.501 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:06.501 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:06.501 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:06.501 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:06.501 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:06.501 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:06.758 BaseBdev2 00:21:06.758 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:06.758 18:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:06.758 18:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:06.758 18:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:06.758 18:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:06.758 18:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:06.758 18:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:07.016 18:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:07.274 [ 00:21:07.274 { 00:21:07.275 "name": "BaseBdev2", 00:21:07.275 "aliases": [ 
00:21:07.275 "d8e532c0-6f39-4231-bd21-e3c367b23850" 00:21:07.275 ], 00:21:07.275 "product_name": "Malloc disk", 00:21:07.275 "block_size": 512, 00:21:07.275 "num_blocks": 65536, 00:21:07.275 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850", 00:21:07.275 "assigned_rate_limits": { 00:21:07.275 "rw_ios_per_sec": 0, 00:21:07.275 "rw_mbytes_per_sec": 0, 00:21:07.275 "r_mbytes_per_sec": 0, 00:21:07.275 "w_mbytes_per_sec": 0 00:21:07.275 }, 00:21:07.275 "claimed": false, 00:21:07.275 "zoned": false, 00:21:07.275 "supported_io_types": { 00:21:07.275 "read": true, 00:21:07.275 "write": true, 00:21:07.275 "unmap": true, 00:21:07.275 "flush": true, 00:21:07.275 "reset": true, 00:21:07.275 "nvme_admin": false, 00:21:07.275 "nvme_io": false, 00:21:07.275 "nvme_io_md": false, 00:21:07.275 "write_zeroes": true, 00:21:07.275 "zcopy": true, 00:21:07.275 "get_zone_info": false, 00:21:07.275 "zone_management": false, 00:21:07.275 "zone_append": false, 00:21:07.275 "compare": false, 00:21:07.275 "compare_and_write": false, 00:21:07.275 "abort": true, 00:21:07.275 "seek_hole": false, 00:21:07.275 "seek_data": false, 00:21:07.275 "copy": true, 00:21:07.275 "nvme_iov_md": false 00:21:07.275 }, 00:21:07.275 "memory_domains": [ 00:21:07.275 { 00:21:07.275 "dma_device_id": "system", 00:21:07.275 "dma_device_type": 1 00:21:07.275 }, 00:21:07.275 { 00:21:07.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.275 "dma_device_type": 2 00:21:07.275 } 00:21:07.275 ], 00:21:07.275 "driver_specific": {} 00:21:07.275 } 00:21:07.275 ] 00:21:07.275 18:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:07.275 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:07.275 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:07.275 18:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:07.533 BaseBdev3 00:21:07.533 18:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:07.533 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:07.533 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:07.533 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:07.533 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:07.533 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:07.533 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:07.790 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:08.048 [ 00:21:08.048 { 00:21:08.048 "name": "BaseBdev3", 00:21:08.048 "aliases": [ 00:21:08.048 "962bb006-dc5c-48ed-aa46-3c70b142baf0" 00:21:08.048 ], 00:21:08.048 "product_name": "Malloc disk", 00:21:08.048 "block_size": 512, 00:21:08.048 "num_blocks": 65536, 00:21:08.048 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0", 00:21:08.048 "assigned_rate_limits": { 00:21:08.048 "rw_ios_per_sec": 0, 00:21:08.048 "rw_mbytes_per_sec": 0, 00:21:08.048 "r_mbytes_per_sec": 0, 00:21:08.048 "w_mbytes_per_sec": 0 00:21:08.048 }, 00:21:08.048 "claimed": false, 00:21:08.048 "zoned": false, 00:21:08.048 "supported_io_types": { 00:21:08.048 "read": true, 00:21:08.048 "write": true, 00:21:08.048 "unmap": true, 00:21:08.048 "flush": true, 00:21:08.048 "reset": true, 00:21:08.048 "nvme_admin": false, 00:21:08.048 
"nvme_io": false, 00:21:08.048 "nvme_io_md": false, 00:21:08.048 "write_zeroes": true, 00:21:08.048 "zcopy": true, 00:21:08.048 "get_zone_info": false, 00:21:08.048 "zone_management": false, 00:21:08.048 "zone_append": false, 00:21:08.048 "compare": false, 00:21:08.048 "compare_and_write": false, 00:21:08.048 "abort": true, 00:21:08.048 "seek_hole": false, 00:21:08.048 "seek_data": false, 00:21:08.048 "copy": true, 00:21:08.048 "nvme_iov_md": false 00:21:08.048 }, 00:21:08.048 "memory_domains": [ 00:21:08.048 { 00:21:08.048 "dma_device_id": "system", 00:21:08.048 "dma_device_type": 1 00:21:08.048 }, 00:21:08.048 { 00:21:08.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.048 "dma_device_type": 2 00:21:08.048 } 00:21:08.048 ], 00:21:08.048 "driver_specific": {} 00:21:08.048 } 00:21:08.048 ] 00:21:08.048 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:08.048 18:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:08.048 18:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:08.048 18:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:08.306 BaseBdev4 00:21:08.306 18:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:08.306 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:08.306 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:08.306 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:08.306 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:08.306 18:24:51 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:08.306 18:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:08.565 18:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:08.565 [ 00:21:08.565 { 00:21:08.565 "name": "BaseBdev4", 00:21:08.565 "aliases": [ 00:21:08.565 "6d169095-940a-4521-8407-f342df56550e" 00:21:08.565 ], 00:21:08.565 "product_name": "Malloc disk", 00:21:08.565 "block_size": 512, 00:21:08.565 "num_blocks": 65536, 00:21:08.565 "uuid": "6d169095-940a-4521-8407-f342df56550e", 00:21:08.565 "assigned_rate_limits": { 00:21:08.565 "rw_ios_per_sec": 0, 00:21:08.565 "rw_mbytes_per_sec": 0, 00:21:08.565 "r_mbytes_per_sec": 0, 00:21:08.565 "w_mbytes_per_sec": 0 00:21:08.565 }, 00:21:08.565 "claimed": false, 00:21:08.565 "zoned": false, 00:21:08.565 "supported_io_types": { 00:21:08.565 "read": true, 00:21:08.565 "write": true, 00:21:08.565 "unmap": true, 00:21:08.565 "flush": true, 00:21:08.565 "reset": true, 00:21:08.565 "nvme_admin": false, 00:21:08.565 "nvme_io": false, 00:21:08.565 "nvme_io_md": false, 00:21:08.565 "write_zeroes": true, 00:21:08.565 "zcopy": true, 00:21:08.565 "get_zone_info": false, 00:21:08.565 "zone_management": false, 00:21:08.565 "zone_append": false, 00:21:08.565 "compare": false, 00:21:08.565 "compare_and_write": false, 00:21:08.565 "abort": true, 00:21:08.565 "seek_hole": false, 00:21:08.565 "seek_data": false, 00:21:08.565 "copy": true, 00:21:08.565 "nvme_iov_md": false 00:21:08.565 }, 00:21:08.565 "memory_domains": [ 00:21:08.565 { 00:21:08.565 "dma_device_id": "system", 00:21:08.565 "dma_device_type": 1 00:21:08.565 }, 00:21:08.565 { 00:21:08.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.565 "dma_device_type": 
2 00:21:08.565 } 00:21:08.565 ], 00:21:08.565 "driver_specific": {} 00:21:08.565 } 00:21:08.565 ] 00:21:08.565 18:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:08.565 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:08.565 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:08.565 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:08.822 [2024-07-12 18:24:52.513069] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:08.823 [2024-07-12 18:24:52.513107] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:08.823 [2024-07-12 18:24:52.513125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:08.823 [2024-07-12 18:24:52.514454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:08.823 [2024-07-12 18:24:52.514495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=4 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.823 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:09.080 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:09.080 "name": "Existed_Raid", 00:21:09.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.080 "strip_size_kb": 0, 00:21:09.080 "state": "configuring", 00:21:09.080 "raid_level": "raid1", 00:21:09.080 "superblock": false, 00:21:09.080 "num_base_bdevs": 4, 00:21:09.080 "num_base_bdevs_discovered": 3, 00:21:09.080 "num_base_bdevs_operational": 4, 00:21:09.080 "base_bdevs_list": [ 00:21:09.080 { 00:21:09.080 "name": "BaseBdev1", 00:21:09.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.080 "is_configured": false, 00:21:09.080 "data_offset": 0, 00:21:09.080 "data_size": 0 00:21:09.080 }, 00:21:09.080 { 00:21:09.080 "name": "BaseBdev2", 00:21:09.080 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850", 00:21:09.080 "is_configured": true, 00:21:09.080 "data_offset": 0, 00:21:09.080 "data_size": 65536 00:21:09.080 }, 00:21:09.080 { 00:21:09.080 "name": "BaseBdev3", 00:21:09.080 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0", 00:21:09.080 "is_configured": true, 00:21:09.080 "data_offset": 0, 00:21:09.080 "data_size": 65536 00:21:09.080 }, 00:21:09.080 { 
00:21:09.080 "name": "BaseBdev4", 00:21:09.080 "uuid": "6d169095-940a-4521-8407-f342df56550e", 00:21:09.080 "is_configured": true, 00:21:09.080 "data_offset": 0, 00:21:09.080 "data_size": 65536 00:21:09.080 } 00:21:09.080 ] 00:21:09.080 }' 00:21:09.080 18:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:09.080 18:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:10.013 [2024-07-12 18:24:53.607960] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.013 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.272 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.272 "name": "Existed_Raid", 00:21:10.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.272 "strip_size_kb": 0, 00:21:10.272 "state": "configuring", 00:21:10.272 "raid_level": "raid1", 00:21:10.272 "superblock": false, 00:21:10.272 "num_base_bdevs": 4, 00:21:10.272 "num_base_bdevs_discovered": 2, 00:21:10.272 "num_base_bdevs_operational": 4, 00:21:10.272 "base_bdevs_list": [ 00:21:10.272 { 00:21:10.272 "name": "BaseBdev1", 00:21:10.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.272 "is_configured": false, 00:21:10.272 "data_offset": 0, 00:21:10.272 "data_size": 0 00:21:10.272 }, 00:21:10.272 { 00:21:10.272 "name": null, 00:21:10.272 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850", 00:21:10.272 "is_configured": false, 00:21:10.272 "data_offset": 0, 00:21:10.272 "data_size": 65536 00:21:10.272 }, 00:21:10.272 { 00:21:10.272 "name": "BaseBdev3", 00:21:10.272 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0", 00:21:10.272 "is_configured": true, 00:21:10.272 "data_offset": 0, 00:21:10.272 "data_size": 65536 00:21:10.272 }, 00:21:10.272 { 00:21:10.272 "name": "BaseBdev4", 00:21:10.272 "uuid": "6d169095-940a-4521-8407-f342df56550e", 00:21:10.272 "is_configured": true, 00:21:10.272 "data_offset": 0, 00:21:10.272 "data_size": 65536 00:21:10.272 } 00:21:10.272 ] 00:21:10.272 }' 00:21:10.272 18:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.272 18:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:10.892 18:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.892 18:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:11.150 18:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:11.150 18:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:11.408 [2024-07-12 18:24:54.964016] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:11.408 BaseBdev1 00:21:11.408 18:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:11.408 18:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:11.408 18:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:11.408 18:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:11.408 18:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:11.408 18:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:11.408 18:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:11.666 18:24:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:11.925 [ 00:21:11.925 { 00:21:11.925 "name": "BaseBdev1", 00:21:11.925 "aliases": [ 00:21:11.925 "8f10f72a-2c12-4c66-91bf-88786815061b" 00:21:11.925 ], 00:21:11.925 
"product_name": "Malloc disk", 00:21:11.925 "block_size": 512, 00:21:11.925 "num_blocks": 65536, 00:21:11.925 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b", 00:21:11.925 "assigned_rate_limits": { 00:21:11.925 "rw_ios_per_sec": 0, 00:21:11.925 "rw_mbytes_per_sec": 0, 00:21:11.925 "r_mbytes_per_sec": 0, 00:21:11.925 "w_mbytes_per_sec": 0 00:21:11.925 }, 00:21:11.925 "claimed": true, 00:21:11.925 "claim_type": "exclusive_write", 00:21:11.925 "zoned": false, 00:21:11.925 "supported_io_types": { 00:21:11.925 "read": true, 00:21:11.925 "write": true, 00:21:11.925 "unmap": true, 00:21:11.925 "flush": true, 00:21:11.925 "reset": true, 00:21:11.925 "nvme_admin": false, 00:21:11.925 "nvme_io": false, 00:21:11.925 "nvme_io_md": false, 00:21:11.925 "write_zeroes": true, 00:21:11.925 "zcopy": true, 00:21:11.925 "get_zone_info": false, 00:21:11.925 "zone_management": false, 00:21:11.925 "zone_append": false, 00:21:11.925 "compare": false, 00:21:11.925 "compare_and_write": false, 00:21:11.925 "abort": true, 00:21:11.925 "seek_hole": false, 00:21:11.925 "seek_data": false, 00:21:11.925 "copy": true, 00:21:11.925 "nvme_iov_md": false 00:21:11.925 }, 00:21:11.925 "memory_domains": [ 00:21:11.925 { 00:21:11.925 "dma_device_id": "system", 00:21:11.925 "dma_device_type": 1 00:21:11.925 }, 00:21:11.925 { 00:21:11.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.925 "dma_device_type": 2 00:21:11.925 } 00:21:11.925 ], 00:21:11.925 "driver_specific": {} 00:21:11.925 } 00:21:11.925 ] 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:11.925 
18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.925 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:12.183 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.183 "name": "Existed_Raid", 00:21:12.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.183 "strip_size_kb": 0, 00:21:12.183 "state": "configuring", 00:21:12.183 "raid_level": "raid1", 00:21:12.183 "superblock": false, 00:21:12.183 "num_base_bdevs": 4, 00:21:12.183 "num_base_bdevs_discovered": 3, 00:21:12.183 "num_base_bdevs_operational": 4, 00:21:12.183 "base_bdevs_list": [ 00:21:12.183 { 00:21:12.183 "name": "BaseBdev1", 00:21:12.183 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b", 00:21:12.183 "is_configured": true, 00:21:12.183 "data_offset": 0, 00:21:12.183 "data_size": 65536 00:21:12.183 }, 00:21:12.183 { 00:21:12.183 "name": null, 00:21:12.183 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850", 00:21:12.183 "is_configured": false, 00:21:12.183 "data_offset": 0, 
00:21:12.183 "data_size": 65536
00:21:12.183 },
00:21:12.183 {
00:21:12.183 "name": "BaseBdev3",
00:21:12.183 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0",
00:21:12.183 "is_configured": true,
00:21:12.183 "data_offset": 0,
00:21:12.183 "data_size": 65536
00:21:12.183 },
00:21:12.183 {
00:21:12.183 "name": "BaseBdev4",
00:21:12.183 "uuid": "6d169095-940a-4521-8407-f342df56550e",
00:21:12.183 "is_configured": true,
00:21:12.183 "data_offset": 0,
00:21:12.183 "data_size": 65536
00:21:12.183 }
00:21:12.183 ]
00:21:12.183 }'
00:21:12.183 18:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:21:12.183 18:24:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:21:12.749 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:12.749 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:21:13.008 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:21:13.008 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
00:21:13.266 [2024-07-12 18:24:56.788914] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:13.266 18:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:21:13.525 18:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:21:13.525 "name": "Existed_Raid",
00:21:13.525 "uuid": "00000000-0000-0000-0000-000000000000",
00:21:13.525 "strip_size_kb": 0,
00:21:13.525 "state": "configuring",
00:21:13.525 "raid_level": "raid1",
00:21:13.525 "superblock": false,
00:21:13.525 "num_base_bdevs": 4,
00:21:13.525 "num_base_bdevs_discovered": 2,
00:21:13.525 "num_base_bdevs_operational": 4,
00:21:13.525 "base_bdevs_list": [
00:21:13.525 {
00:21:13.525 "name": "BaseBdev1",
00:21:13.525 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b",
00:21:13.525 "is_configured": true,
00:21:13.525 "data_offset": 0,
00:21:13.525 "data_size": 65536
00:21:13.525 },
00:21:13.525 {
00:21:13.525 "name": null,
00:21:13.525 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850",
00:21:13.525 "is_configured": false,
00:21:13.525 "data_offset": 0,
00:21:13.525 "data_size": 65536
00:21:13.525 },
00:21:13.525 {
00:21:13.525 "name": null,
00:21:13.525 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0",
00:21:13.525 "is_configured": false,
00:21:13.525 "data_offset": 0,
00:21:13.525 "data_size": 65536
00:21:13.525 },
00:21:13.525 {
00:21:13.525 "name": "BaseBdev4",
00:21:13.525 "uuid": "6d169095-940a-4521-8407-f342df56550e",
00:21:13.525 "is_configured": true,
00:21:13.525 "data_offset": 0,
00:21:13.525 "data_size": 65536
00:21:13.525 }
00:21:13.525 ]
00:21:13.525 }'
00:21:13.525 18:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:21:13.525 18:24:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:21:14.090 18:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:14.090 18:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:21:14.348 18:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]]
00:21:14.348 18:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
00:21:14.606 [2024-07-12 18:24:58.108590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:14.606 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:21:14.864 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:21:14.864 "name": "Existed_Raid",
00:21:14.864 "uuid": "00000000-0000-0000-0000-000000000000",
00:21:14.864 "strip_size_kb": 0,
00:21:14.864 "state": "configuring",
00:21:14.864 "raid_level": "raid1",
00:21:14.864 "superblock": false,
00:21:14.864 "num_base_bdevs": 4,
00:21:14.864 "num_base_bdevs_discovered": 3,
00:21:14.864 "num_base_bdevs_operational": 4,
00:21:14.864 "base_bdevs_list": [
00:21:14.864 {
00:21:14.864 "name": "BaseBdev1",
00:21:14.864 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b",
00:21:14.864 "is_configured": true,
00:21:14.864 "data_offset": 0,
00:21:14.864 "data_size": 65536
00:21:14.864 },
00:21:14.864 {
00:21:14.864 "name": null,
00:21:14.864 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850",
00:21:14.864 "is_configured": false,
00:21:14.864 "data_offset": 0,
00:21:14.864 "data_size": 65536
00:21:14.864 },
00:21:14.864 {
00:21:14.864 "name": "BaseBdev3",
00:21:14.864 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0",
00:21:14.864 "is_configured": true,
00:21:14.864 "data_offset": 0,
00:21:14.864 "data_size": 65536
00:21:14.864 },
00:21:14.864 {
00:21:14.864 "name": "BaseBdev4",
00:21:14.864 "uuid": "6d169095-940a-4521-8407-f342df56550e",
00:21:14.864 "is_configured": true,
00:21:14.864 "data_offset": 0,
00:21:14.864 "data_size": 65536
00:21:14.864 }
00:21:14.864 ]
00:21:14.864 }'
00:21:14.864 18:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:21:14.864 18:24:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:21:15.430 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:15.430 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured'
00:21:15.688 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]]
00:21:15.688 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:21:15.947 [2024-07-12 18:24:59.480240] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:15.947 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:21:16.206 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:21:16.206 "name": "Existed_Raid",
00:21:16.206 "uuid": "00000000-0000-0000-0000-000000000000",
00:21:16.206 "strip_size_kb": 0,
00:21:16.206 "state": "configuring",
00:21:16.206 "raid_level": "raid1",
00:21:16.206 "superblock": false,
00:21:16.206 "num_base_bdevs": 4,
00:21:16.206 "num_base_bdevs_discovered": 2,
00:21:16.206 "num_base_bdevs_operational": 4,
00:21:16.206 "base_bdevs_list": [
00:21:16.206 {
00:21:16.206 "name": null,
00:21:16.206 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b",
00:21:16.206 "is_configured": false,
00:21:16.206 "data_offset": 0,
00:21:16.206 "data_size": 65536
00:21:16.206 },
00:21:16.206 {
00:21:16.206 "name": null,
00:21:16.206 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850",
00:21:16.206 "is_configured": false,
00:21:16.206 "data_offset": 0,
00:21:16.206 "data_size": 65536
00:21:16.206 },
00:21:16.206 {
00:21:16.206 "name": "BaseBdev3",
00:21:16.206 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0",
00:21:16.206 "is_configured": true,
00:21:16.206 "data_offset": 0,
00:21:16.206 "data_size": 65536
00:21:16.206 },
00:21:16.206 {
00:21:16.206 "name": "BaseBdev4",
00:21:16.206 "uuid": "6d169095-940a-4521-8407-f342df56550e",
00:21:16.206 "is_configured": true,
00:21:16.206 "data_offset": 0,
00:21:16.206 "data_size": 65536
00:21:16.206 }
00:21:16.206 ]
00:21:16.206 }'
00:21:16.206 18:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:21:16.206 18:24:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:21:16.773 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:21:16.773 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:17.031 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]]
00:21:17.031 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
00:21:17.289 [2024-07-12 18:25:00.835880] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:17.289 18:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:21:17.547 18:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:21:17.547 "name": "Existed_Raid",
00:21:17.547 "uuid": "00000000-0000-0000-0000-000000000000",
00:21:17.547 "strip_size_kb": 0,
00:21:17.547 "state": "configuring",
00:21:17.547 "raid_level": "raid1",
00:21:17.547 "superblock": false,
00:21:17.547 "num_base_bdevs": 4,
00:21:17.547 "num_base_bdevs_discovered": 3,
00:21:17.547 "num_base_bdevs_operational": 4,
00:21:17.547 "base_bdevs_list": [
00:21:17.547 {
00:21:17.547 "name": null,
00:21:17.547 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b",
00:21:17.547 "is_configured": false,
00:21:17.547 "data_offset": 0,
00:21:17.547 "data_size": 65536
00:21:17.547 },
00:21:17.547 {
00:21:17.547 "name": "BaseBdev2",
00:21:17.547 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850",
00:21:17.547 "is_configured": true,
00:21:17.547 "data_offset": 0,
00:21:17.547 "data_size": 65536
00:21:17.547 },
00:21:17.547 {
00:21:17.547 "name": "BaseBdev3",
00:21:17.547 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0",
00:21:17.547 "is_configured": true,
00:21:17.547 "data_offset": 0,
00:21:17.547 "data_size": 65536
00:21:17.547 },
00:21:17.547 {
00:21:17.547 "name": "BaseBdev4",
00:21:17.547 "uuid": "6d169095-940a-4521-8407-f342df56550e",
00:21:17.547 "is_configured": true,
00:21:17.547 "data_offset": 0,
00:21:17.547 "data_size": 65536
00:21:17.547 }
00:21:17.547 ]
00:21:17.547 }'
00:21:17.547 18:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:21:17.547 18:25:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:21:18.113 18:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:21:18.113 18:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:18.371 18:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]]
00:21:18.371 18:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:18.371 18:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid'
00:21:18.630 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8f10f72a-2c12-4c66-91bf-88786815061b
00:21:18.888 [2024-07-12 18:25:02.359370] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed
00:21:18.888 [2024-07-12 18:25:02.359410] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f83610
00:21:18.888 [2024-07-12 18:25:02.359419] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512
00:21:18.888 [2024-07-12 18:25:02.359612] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f84a70
00:21:18.888 [2024-07-12 18:25:02.359733] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f83610
00:21:18.888 [2024-07-12 18:25:02.359743] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f83610
00:21:18.888 [2024-07-12 18:25:02.359907] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:21:18.888 NewBaseBdev
00:21:18.888 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev
00:21:18.888 18:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev
00:21:18.888 18:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:21:18.888 18:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:21:18.888 18:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:21:18.888 18:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:21:18.888 18:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:21:19.147 18:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000
00:21:19.147 [
00:21:19.147 {
00:21:19.147 "name": "NewBaseBdev",
00:21:19.147 "aliases": [
00:21:19.147 "8f10f72a-2c12-4c66-91bf-88786815061b"
00:21:19.148 ],
00:21:19.148 "product_name": "Malloc disk",
00:21:19.148 "block_size": 512,
00:21:19.148 "num_blocks": 65536,
00:21:19.148 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b",
00:21:19.148 "assigned_rate_limits": {
00:21:19.148 "rw_ios_per_sec": 0,
00:21:19.148 "rw_mbytes_per_sec": 0,
00:21:19.148 "r_mbytes_per_sec": 0,
00:21:19.148 "w_mbytes_per_sec": 0
00:21:19.148 },
00:21:19.148 "claimed": true,
00:21:19.148 "claim_type": "exclusive_write",
00:21:19.148 "zoned": false,
00:21:19.148 "supported_io_types": {
00:21:19.148 "read": true,
00:21:19.148 "write": true,
00:21:19.148 "unmap": true,
00:21:19.148 "flush": true,
00:21:19.148 "reset": true,
00:21:19.148 "nvme_admin": false,
00:21:19.148 "nvme_io": false,
00:21:19.148 "nvme_io_md": false,
00:21:19.148 "write_zeroes": true,
00:21:19.148 "zcopy": true,
00:21:19.148 "get_zone_info": false,
00:21:19.148 "zone_management": false,
00:21:19.148 "zone_append": false,
00:21:19.148 "compare": false,
00:21:19.148 "compare_and_write": false,
00:21:19.148 "abort": true,
00:21:19.148 "seek_hole": false,
00:21:19.148 "seek_data": false,
00:21:19.148 "copy": true,
00:21:19.148 "nvme_iov_md": false
00:21:19.148 },
00:21:19.148 "memory_domains": [
00:21:19.148 {
00:21:19.148 "dma_device_id": "system",
00:21:19.148 "dma_device_type": 1
00:21:19.148 },
00:21:19.148 {
00:21:19.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:21:19.148 "dma_device_type": 2
00:21:19.148 }
00:21:19.148 ],
00:21:19.148 "driver_specific": {}
00:21:19.148 }
00:21:19.148 ]
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:21:19.148 18:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:21:19.406 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:21:19.406 "name": "Existed_Raid",
00:21:19.406 "uuid": "761b1c05-293d-44e3-8790-fd43733fe542",
00:21:19.406 "strip_size_kb": 0,
00:21:19.406 "state": "online",
00:21:19.406 "raid_level": "raid1",
00:21:19.406 "superblock": false,
00:21:19.406 "num_base_bdevs": 4,
00:21:19.406 "num_base_bdevs_discovered": 4,
00:21:19.406 "num_base_bdevs_operational": 4,
00:21:19.406 "base_bdevs_list": [
00:21:19.406 {
00:21:19.406 "name": "NewBaseBdev",
00:21:19.406 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b",
00:21:19.406 "is_configured": true,
00:21:19.406 "data_offset": 0,
00:21:19.406 "data_size": 65536
00:21:19.406 },
00:21:19.406 {
00:21:19.406 "name": "BaseBdev2",
00:21:19.406 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850",
00:21:19.406 "is_configured": true,
00:21:19.406 "data_offset": 0,
00:21:19.406 "data_size": 65536
00:21:19.406 },
00:21:19.406 {
00:21:19.406 "name": "BaseBdev3",
00:21:19.406 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0",
00:21:19.406 "is_configured": true,
00:21:19.406 "data_offset": 0,
00:21:19.406 "data_size": 65536
00:21:19.406 },
00:21:19.406 {
00:21:19.406 "name": "BaseBdev4",
00:21:19.406 "uuid": "6d169095-940a-4521-8407-f342df56550e",
00:21:19.406 "is_configured": true,
00:21:19.406 "data_offset": 0,
00:21:19.406 "data_size": 65536
00:21:19.406 }
00:21:19.406 ]
00:21:19.406 }'
00:21:19.406 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:21:19.406 18:25:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:21:20.337 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid
00:21:20.337 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid
00:21:20.337 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:21:20.337 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:21:20.337 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:21:20.337 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name
00:21:20.337 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:21:20.337 18:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:21:20.595 [2024-07-12 18:25:04.208596] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:21:20.595 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:21:20.595 "name": "Existed_Raid",
00:21:20.595 "aliases": [
00:21:20.595 "761b1c05-293d-44e3-8790-fd43733fe542"
00:21:20.595 ],
00:21:20.595 "product_name": "Raid Volume",
00:21:20.595 "block_size": 512,
00:21:20.595 "num_blocks": 65536,
00:21:20.595 "uuid": "761b1c05-293d-44e3-8790-fd43733fe542",
00:21:20.595 "assigned_rate_limits": {
00:21:20.595 "rw_ios_per_sec": 0,
00:21:20.595 "rw_mbytes_per_sec": 0,
00:21:20.595 "r_mbytes_per_sec": 0,
00:21:20.595 "w_mbytes_per_sec": 0
00:21:20.595 },
00:21:20.595 "claimed": false,
00:21:20.595 "zoned": false,
00:21:20.595 "supported_io_types": {
00:21:20.595 "read": true,
00:21:20.595 "write": true,
00:21:20.595 "unmap": false,
00:21:20.595 "flush": false,
00:21:20.595 "reset": true,
00:21:20.595 "nvme_admin": false,
00:21:20.595 "nvme_io": false,
00:21:20.595 "nvme_io_md": false,
00:21:20.595 "write_zeroes": true,
00:21:20.595 "zcopy": false,
00:21:20.595 "get_zone_info": false,
00:21:20.595 "zone_management": false,
00:21:20.595 "zone_append": false,
00:21:20.595 "compare": false,
00:21:20.595 "compare_and_write": false,
00:21:20.596 "abort": false,
00:21:20.596 "seek_hole": false,
00:21:20.596 "seek_data": false,
00:21:20.596 "copy": false,
00:21:20.596 "nvme_iov_md": false
00:21:20.596 },
00:21:20.596 "memory_domains": [
00:21:20.596 {
00:21:20.596 "dma_device_id": "system",
00:21:20.596 "dma_device_type": 1
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:21:20.596 "dma_device_type": 2
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "dma_device_id": "system",
00:21:20.596 "dma_device_type": 1
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:21:20.596 "dma_device_type": 2
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "dma_device_id": "system",
00:21:20.596 "dma_device_type": 1
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:21:20.596 "dma_device_type": 2
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "dma_device_id": "system",
00:21:20.596 "dma_device_type": 1
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:21:20.596 "dma_device_type": 2
00:21:20.596 }
00:21:20.596 ],
00:21:20.596 "driver_specific": {
00:21:20.596 "raid": {
00:21:20.596 "uuid": "761b1c05-293d-44e3-8790-fd43733fe542",
00:21:20.596 "strip_size_kb": 0,
00:21:20.596 "state": "online",
00:21:20.596 "raid_level": "raid1",
00:21:20.596 "superblock": false,
00:21:20.596 "num_base_bdevs": 4,
00:21:20.596 "num_base_bdevs_discovered": 4,
00:21:20.596 "num_base_bdevs_operational": 4,
00:21:20.596 "base_bdevs_list": [
00:21:20.596 {
00:21:20.596 "name": "NewBaseBdev",
00:21:20.596 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b",
00:21:20.596 "is_configured": true,
00:21:20.596 "data_offset": 0,
00:21:20.596 "data_size": 65536
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "name": "BaseBdev2",
00:21:20.596 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850",
00:21:20.596 "is_configured": true,
00:21:20.596 "data_offset": 0,
00:21:20.596 "data_size": 65536
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "name": "BaseBdev3",
00:21:20.596 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0",
00:21:20.596 "is_configured": true,
00:21:20.596 "data_offset": 0,
00:21:20.596 "data_size": 65536
00:21:20.596 },
00:21:20.596 {
00:21:20.596 "name": "BaseBdev4",
00:21:20.596 "uuid": "6d169095-940a-4521-8407-f342df56550e",
00:21:20.596 "is_configured": true,
00:21:20.596 "data_offset": 0,
00:21:20.596 "data_size": 65536
00:21:20.596 }
00:21:20.596 ]
00:21:20.596 }
00:21:20.596 }
00:21:20.596 }'
00:21:20.596 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:21:20.596 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev
00:21:20.596 BaseBdev2
00:21:20.596 BaseBdev3
00:21:20.596 BaseBdev4'
00:21:20.596 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:21:20.596 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev
00:21:20.596 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:21:20.855 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:21:20.855 "name": "NewBaseBdev",
00:21:20.855 "aliases": [
00:21:20.855 "8f10f72a-2c12-4c66-91bf-88786815061b"
00:21:20.855 ],
00:21:20.855 "product_name": "Malloc disk",
00:21:20.855 "block_size": 512,
00:21:20.855 "num_blocks": 65536,
00:21:20.855 "uuid": "8f10f72a-2c12-4c66-91bf-88786815061b",
00:21:20.855 "assigned_rate_limits": {
00:21:20.855 "rw_ios_per_sec": 0,
00:21:20.855 "rw_mbytes_per_sec": 0,
00:21:20.855 "r_mbytes_per_sec": 0,
00:21:20.855 "w_mbytes_per_sec": 0
00:21:20.855 },
00:21:20.855 "claimed": true,
00:21:20.855 "claim_type": "exclusive_write",
00:21:20.855 "zoned": false,
00:21:20.855 "supported_io_types": {
00:21:20.855 "read": true,
00:21:20.855 "write": true,
00:21:20.855 "unmap": true,
00:21:20.855 "flush": true,
00:21:20.855 "reset": true,
00:21:20.855 "nvme_admin": false,
00:21:20.855 "nvme_io": false,
00:21:20.855 "nvme_io_md": false,
00:21:20.855 "write_zeroes": true,
00:21:20.855 "zcopy": true,
00:21:20.855 "get_zone_info": false,
00:21:20.855 "zone_management": false,
00:21:20.855 "zone_append": false,
00:21:20.855 "compare": false,
00:21:20.855 "compare_and_write": false,
00:21:20.855 "abort": true,
00:21:20.855 "seek_hole": false,
00:21:20.855 "seek_data": false,
00:21:20.855 "copy": true,
00:21:20.855 "nvme_iov_md": false
00:21:20.855 },
00:21:20.855 "memory_domains": [
00:21:20.855 {
00:21:20.855 "dma_device_id": "system",
00:21:20.855 "dma_device_type": 1
00:21:20.855 },
00:21:20.855 {
00:21:20.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:21:20.855 "dma_device_type": 2
00:21:20.855 }
00:21:20.855 ],
00:21:20.855 "driver_specific": {}
00:21:20.855 }'
00:21:20.855 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:21:20.855 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:21:21.113 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:21:21.113 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:21:21.113 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:21:21.113 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:21:21.113 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:21:21.113 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:21:21.113 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:21:21.113 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:21:21.371 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:21:21.371 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:21:21.371 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:21:21.371 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2
00:21:21.371 18:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:21:21.629 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:21:21.629 "name": "BaseBdev2",
00:21:21.629 "aliases": [
00:21:21.629 "d8e532c0-6f39-4231-bd21-e3c367b23850"
00:21:21.629 ],
00:21:21.629 "product_name": "Malloc disk",
00:21:21.629 "block_size": 512,
00:21:21.629 "num_blocks": 65536,
00:21:21.629 "uuid": "d8e532c0-6f39-4231-bd21-e3c367b23850",
00:21:21.629 "assigned_rate_limits": {
00:21:21.629 "rw_ios_per_sec": 0,
00:21:21.629 "rw_mbytes_per_sec": 0,
00:21:21.629 "r_mbytes_per_sec": 0,
00:21:21.629 "w_mbytes_per_sec": 0
00:21:21.629 },
00:21:21.629 "claimed": true,
00:21:21.629 "claim_type": "exclusive_write",
00:21:21.629 "zoned": false,
00:21:21.629 "supported_io_types": {
00:21:21.629 "read": true,
00:21:21.629 "write": true,
00:21:21.629 "unmap": true,
00:21:21.629 "flush": true,
00:21:21.629 "reset": true,
00:21:21.629 "nvme_admin": false,
00:21:21.629 "nvme_io": false,
00:21:21.629 "nvme_io_md": false,
00:21:21.629 "write_zeroes": true,
00:21:21.629 "zcopy": true,
00:21:21.629 "get_zone_info": false,
00:21:21.629 "zone_management": false,
00:21:21.629 "zone_append": false,
00:21:21.629 "compare": false,
00:21:21.629 "compare_and_write": false,
00:21:21.629 "abort": true,
00:21:21.629 "seek_hole": false,
00:21:21.629 "seek_data": false,
00:21:21.629 "copy": true,
00:21:21.629 "nvme_iov_md": false
00:21:21.629 },
00:21:21.629 "memory_domains": [
00:21:21.629 {
00:21:21.629 "dma_device_id": "system",
00:21:21.629 "dma_device_type": 1
00:21:21.629 },
00:21:21.629 {
00:21:21.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:21:21.629 "dma_device_type": 2
00:21:21.629 }
00:21:21.629 ],
00:21:21.629 "driver_specific": {}
00:21:21.629 }'
00:21:21.629 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:21:21.629 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:21:21.629 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:21:21.629 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:21:21.629 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3
00:21:21.887 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:21:22.144 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:21:22.145 "name": "BaseBdev3",
00:21:22.145 "aliases": [
00:21:22.145 "962bb006-dc5c-48ed-aa46-3c70b142baf0"
00:21:22.145 ],
00:21:22.145 "product_name": "Malloc disk",
00:21:22.145 "block_size": 512,
00:21:22.145 "num_blocks": 65536,
00:21:22.145 "uuid": "962bb006-dc5c-48ed-aa46-3c70b142baf0",
00:21:22.145 "assigned_rate_limits": {
00:21:22.145 "rw_ios_per_sec": 0,
00:21:22.145 "rw_mbytes_per_sec": 0,
00:21:22.145 "r_mbytes_per_sec": 0,
00:21:22.145 "w_mbytes_per_sec": 0
00:21:22.145 },
00:21:22.145 "claimed": true,
00:21:22.145 "claim_type": "exclusive_write",
00:21:22.145 "zoned": false,
00:21:22.145 "supported_io_types": {
00:21:22.145 "read": true,
00:21:22.145 "write": true,
00:21:22.145 "unmap": true,
00:21:22.145 "flush": true,
00:21:22.145 "reset": true,
00:21:22.145 "nvme_admin": false,
00:21:22.145 "nvme_io": false,
00:21:22.145 "nvme_io_md": false,
00:21:22.145 "write_zeroes": true,
00:21:22.145 "zcopy": true,
00:21:22.145 "get_zone_info": false,
00:21:22.145 "zone_management": false,
00:21:22.145 "zone_append": false,
00:21:22.145 "compare": false,
00:21:22.145 "compare_and_write": false,
00:21:22.145 "abort": true,
00:21:22.145 "seek_hole": false,
00:21:22.145 "seek_data": false,
00:21:22.145 "copy": true,
00:21:22.145 "nvme_iov_md": false
00:21:22.145 },
00:21:22.145 "memory_domains": [
00:21:22.145 {
00:21:22.145 "dma_device_id": "system",
00:21:22.145 "dma_device_type": 1
00:21:22.145 },
00:21:22.145 {
00:21:22.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:21:22.145 "dma_device_type": 2
00:21:22.145 }
00:21:22.145 ],
00:21:22.145 "driver_specific": {}
00:21:22.145 }'
00:21:22.145 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:21:22.145 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:21:22.145 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:21:22.145 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:21:22.145 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:21:22.403 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:21:22.403 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:21:22.403 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:21:22.403 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:21:22.403 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:21:22.403 18:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:21:22.403 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:21:22.403 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:21:22.403 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4
00:21:22.403 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:21:22.660 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:21:22.660 "name": "BaseBdev4",
00:21:22.660 "aliases": [
00:21:22.660 "6d169095-940a-4521-8407-f342df56550e"
00:21:22.660 ],
00:21:22.660 "product_name": "Malloc disk",
00:21:22.660 "block_size": 512,
00:21:22.660 "num_blocks": 65536,
00:21:22.660 "uuid": "6d169095-940a-4521-8407-f342df56550e",
00:21:22.660 "assigned_rate_limits": {
00:21:22.660 "rw_ios_per_sec": 0,
00:21:22.660 "rw_mbytes_per_sec": 0,
00:21:22.660 "r_mbytes_per_sec": 0,
00:21:22.660 "w_mbytes_per_sec": 0
00:21:22.660 },
00:21:22.660 "claimed": true,
00:21:22.660 "claim_type": "exclusive_write",
00:21:22.660 "zoned": false,
00:21:22.660 "supported_io_types": {
00:21:22.660 "read": true,
00:21:22.660 "write": true,
00:21:22.660 "unmap": true,
00:21:22.660 "flush": true,
00:21:22.660 "reset": true,
00:21:22.660 "nvme_admin": false,
00:21:22.660 "nvme_io": false,
00:21:22.660 "nvme_io_md": false,
00:21:22.660 "write_zeroes": true,
00:21:22.660 "zcopy": true,
00:21:22.660 "get_zone_info": false,
00:21:22.660 "zone_management": false,
00:21:22.660 "zone_append": false,
00:21:22.660 "compare": false,
00:21:22.660 "compare_and_write": false,
00:21:22.660 "abort": true,
00:21:22.660 "seek_hole": false,
00:21:22.661 "seek_data": false,
00:21:22.661 "copy": true,
00:21:22.661 "nvme_iov_md": false
00:21:22.661 },
00:21:22.661 "memory_domains": [
00:21:22.661 {
00:21:22.661 "dma_device_id": "system",
00:21:22.661 "dma_device_type": 1
00:21:22.661 },
00:21:22.661 {
00:21:22.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:21:22.661 "dma_device_type": 2
00:21:22.661 }
00:21:22.661 ],
00:21:22.661 "driver_specific": {}
00:21:22.661 }'
00:21:22.661 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:21:22.661 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:21:22.661 18:25:06 bdev_raid.raid_state_function_test --
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:22.661 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.661 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.918 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.918 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.918 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.918 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:22.918 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.918 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.918 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:22.918 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:23.176 [2024-07-12 18:25:06.795175] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:23.176 [2024-07-12 18:25:06.795198] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:23.176 [2024-07-12 18:25:06.795249] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:23.176 [2024-07-12 18:25:06.795531] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:23.176 [2024-07-12 18:25:06.795543] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f83610 name Existed_Raid, state offline 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2548788 00:21:23.176 18:25:06 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2548788 ']' 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2548788 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2548788 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2548788' 00:21:23.176 killing process with pid 2548788 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2548788 00:21:23.176 [2024-07-12 18:25:06.862415] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:23.176 18:25:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2548788 00:21:23.434 [2024-07-12 18:25:06.904320] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:23.434 18:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:23.434 00:21:23.434 real 0m33.096s 00:21:23.434 user 1m0.790s 00:21:23.434 sys 0m5.895s 00:21:23.434 18:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:23.434 18:25:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:23.434 ************************************ 00:21:23.434 END TEST raid_state_function_test 00:21:23.434 ************************************ 00:21:23.693 18:25:07 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:21:23.693 18:25:07 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:21:23.693 18:25:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:23.693 18:25:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:23.693 18:25:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:23.693 ************************************ 00:21:23.693 START TEST raid_state_function_test_sb 00:21:23.693 ************************************ 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2553674 00:21:23.693 18:25:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2553674' 00:21:23.693 Process raid pid: 2553674 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2553674 /var/tmp/spdk-raid.sock 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2553674 ']' 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:23.693 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:23.693 18:25:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:23.693 [2024-07-12 18:25:07.284522] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:21:23.693 [2024-07-12 18:25:07.284589] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:23.693 [2024-07-12 18:25:07.414071] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:23.951 [2024-07-12 18:25:07.525575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:23.951 [2024-07-12 18:25:07.588269] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:23.951 [2024-07-12 18:25:07.588293] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:24.517 18:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:24.517 18:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:24.517 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:24.775 [2024-07-12 18:25:08.426782] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:24.775 [2024-07-12 18:25:08.426823] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:24.775 [2024-07-12 18:25:08.426833] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:24.775 [2024-07-12 18:25:08.426845] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:24.775 [2024-07-12 18:25:08.426853] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:24.776 [2024-07-12 18:25:08.426864] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:21:24.776 [2024-07-12 18:25:08.426873] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:24.776 [2024-07-12 18:25:08.426884] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:24.776 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.034 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:25.034 "name": "Existed_Raid", 00:21:25.034 "uuid": "03de87f0-8ea0-41cd-9bd0-55c22c68bb68", 
00:21:25.034 "strip_size_kb": 0, 00:21:25.034 "state": "configuring", 00:21:25.034 "raid_level": "raid1", 00:21:25.034 "superblock": true, 00:21:25.034 "num_base_bdevs": 4, 00:21:25.034 "num_base_bdevs_discovered": 0, 00:21:25.034 "num_base_bdevs_operational": 4, 00:21:25.034 "base_bdevs_list": [ 00:21:25.034 { 00:21:25.034 "name": "BaseBdev1", 00:21:25.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.034 "is_configured": false, 00:21:25.034 "data_offset": 0, 00:21:25.034 "data_size": 0 00:21:25.034 }, 00:21:25.034 { 00:21:25.034 "name": "BaseBdev2", 00:21:25.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.034 "is_configured": false, 00:21:25.034 "data_offset": 0, 00:21:25.034 "data_size": 0 00:21:25.034 }, 00:21:25.034 { 00:21:25.034 "name": "BaseBdev3", 00:21:25.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.034 "is_configured": false, 00:21:25.034 "data_offset": 0, 00:21:25.034 "data_size": 0 00:21:25.034 }, 00:21:25.034 { 00:21:25.034 "name": "BaseBdev4", 00:21:25.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.034 "is_configured": false, 00:21:25.034 "data_offset": 0, 00:21:25.034 "data_size": 0 00:21:25.034 } 00:21:25.034 ] 00:21:25.034 }' 00:21:25.034 18:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:25.034 18:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:25.599 18:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:25.868 [2024-07-12 18:25:09.493471] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:25.868 [2024-07-12 18:25:09.493501] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1236aa0 name Existed_Raid, state configuring 00:21:25.868 18:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:26.126 [2024-07-12 18:25:09.730121] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:26.126 [2024-07-12 18:25:09.730149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:26.126 [2024-07-12 18:25:09.730159] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:26.126 [2024-07-12 18:25:09.730170] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:26.126 [2024-07-12 18:25:09.730179] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:26.126 [2024-07-12 18:25:09.730190] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:26.126 [2024-07-12 18:25:09.730199] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:26.126 [2024-07-12 18:25:09.730210] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:26.126 18:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:26.384 [2024-07-12 18:25:09.984512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:26.384 BaseBdev1 00:21:26.384 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:26.384 18:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:26.384 18:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:26.384 18:25:10 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:26.384 18:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:26.384 18:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:26.384 18:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:26.642 18:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:26.900 [ 00:21:26.900 { 00:21:26.900 "name": "BaseBdev1", 00:21:26.900 "aliases": [ 00:21:26.900 "22d14e60-204c-4fd5-b9a7-413548110f86" 00:21:26.900 ], 00:21:26.900 "product_name": "Malloc disk", 00:21:26.900 "block_size": 512, 00:21:26.900 "num_blocks": 65536, 00:21:26.900 "uuid": "22d14e60-204c-4fd5-b9a7-413548110f86", 00:21:26.900 "assigned_rate_limits": { 00:21:26.900 "rw_ios_per_sec": 0, 00:21:26.900 "rw_mbytes_per_sec": 0, 00:21:26.900 "r_mbytes_per_sec": 0, 00:21:26.900 "w_mbytes_per_sec": 0 00:21:26.900 }, 00:21:26.900 "claimed": true, 00:21:26.900 "claim_type": "exclusive_write", 00:21:26.900 "zoned": false, 00:21:26.900 "supported_io_types": { 00:21:26.900 "read": true, 00:21:26.900 "write": true, 00:21:26.900 "unmap": true, 00:21:26.900 "flush": true, 00:21:26.900 "reset": true, 00:21:26.900 "nvme_admin": false, 00:21:26.900 "nvme_io": false, 00:21:26.900 "nvme_io_md": false, 00:21:26.900 "write_zeroes": true, 00:21:26.900 "zcopy": true, 00:21:26.900 "get_zone_info": false, 00:21:26.900 "zone_management": false, 00:21:26.900 "zone_append": false, 00:21:26.900 "compare": false, 00:21:26.900 "compare_and_write": false, 00:21:26.900 "abort": true, 00:21:26.900 "seek_hole": false, 00:21:26.900 "seek_data": false, 
00:21:26.900 "copy": true, 00:21:26.900 "nvme_iov_md": false 00:21:26.900 }, 00:21:26.900 "memory_domains": [ 00:21:26.900 { 00:21:26.900 "dma_device_id": "system", 00:21:26.900 "dma_device_type": 1 00:21:26.900 }, 00:21:26.900 { 00:21:26.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.900 "dma_device_type": 2 00:21:26.900 } 00:21:26.900 ], 00:21:26.900 "driver_specific": {} 00:21:26.900 } 00:21:26.900 ] 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.900 18:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:21:27.526 18:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.526 "name": "Existed_Raid", 00:21:27.526 "uuid": "0fe2c795-9aa7-495c-997e-1087d6cf8f6a", 00:21:27.526 "strip_size_kb": 0, 00:21:27.526 "state": "configuring", 00:21:27.526 "raid_level": "raid1", 00:21:27.526 "superblock": true, 00:21:27.527 "num_base_bdevs": 4, 00:21:27.527 "num_base_bdevs_discovered": 1, 00:21:27.527 "num_base_bdevs_operational": 4, 00:21:27.527 "base_bdevs_list": [ 00:21:27.527 { 00:21:27.527 "name": "BaseBdev1", 00:21:27.527 "uuid": "22d14e60-204c-4fd5-b9a7-413548110f86", 00:21:27.527 "is_configured": true, 00:21:27.527 "data_offset": 2048, 00:21:27.527 "data_size": 63488 00:21:27.527 }, 00:21:27.527 { 00:21:27.527 "name": "BaseBdev2", 00:21:27.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.527 "is_configured": false, 00:21:27.527 "data_offset": 0, 00:21:27.527 "data_size": 0 00:21:27.527 }, 00:21:27.527 { 00:21:27.527 "name": "BaseBdev3", 00:21:27.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.527 "is_configured": false, 00:21:27.527 "data_offset": 0, 00:21:27.527 "data_size": 0 00:21:27.527 }, 00:21:27.527 { 00:21:27.527 "name": "BaseBdev4", 00:21:27.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.527 "is_configured": false, 00:21:27.527 "data_offset": 0, 00:21:27.527 "data_size": 0 00:21:27.527 } 00:21:27.527 ] 00:21:27.527 }' 00:21:27.527 18:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.527 18:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:28.460 18:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:28.460 [2024-07-12 18:25:12.082079] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:21:28.461 [2024-07-12 18:25:12.082124] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1236310 name Existed_Raid, state configuring 00:21:28.461 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:28.718 [2024-07-12 18:25:12.326770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:28.718 [2024-07-12 18:25:12.328260] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:28.718 [2024-07-12 18:25:12.328310] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:28.718 [2024-07-12 18:25:12.328320] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:28.718 [2024-07-12 18:25:12.328332] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:28.718 [2024-07-12 18:25:12.328341] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:28.718 [2024-07-12 18:25:12.328352] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:28.718 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:28.718 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:28.719 18:25:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:28.719 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.977 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:28.977 "name": "Existed_Raid", 00:21:28.977 "uuid": "f98c468a-8c4f-4a93-8581-f37fca9842fd", 00:21:28.977 "strip_size_kb": 0, 00:21:28.977 "state": "configuring", 00:21:28.977 "raid_level": "raid1", 00:21:28.977 "superblock": true, 00:21:28.977 "num_base_bdevs": 4, 00:21:28.977 "num_base_bdevs_discovered": 1, 00:21:28.977 "num_base_bdevs_operational": 4, 00:21:28.977 "base_bdevs_list": [ 00:21:28.977 { 00:21:28.977 "name": "BaseBdev1", 00:21:28.977 "uuid": "22d14e60-204c-4fd5-b9a7-413548110f86", 00:21:28.977 "is_configured": true, 00:21:28.977 "data_offset": 2048, 00:21:28.977 "data_size": 63488 00:21:28.977 }, 00:21:28.977 { 00:21:28.977 "name": "BaseBdev2", 00:21:28.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.977 "is_configured": false, 
00:21:28.977 "data_offset": 0, 00:21:28.977 "data_size": 0 00:21:28.977 }, 00:21:28.977 { 00:21:28.977 "name": "BaseBdev3", 00:21:28.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.977 "is_configured": false, 00:21:28.977 "data_offset": 0, 00:21:28.977 "data_size": 0 00:21:28.977 }, 00:21:28.977 { 00:21:28.977 "name": "BaseBdev4", 00:21:28.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:28.977 "is_configured": false, 00:21:28.977 "data_offset": 0, 00:21:28.977 "data_size": 0 00:21:28.977 } 00:21:28.977 ] 00:21:28.977 }' 00:21:28.977 18:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:28.977 18:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:29.543 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:29.801 [2024-07-12 18:25:13.384948] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:29.801 BaseBdev2 00:21:29.801 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:29.801 18:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:29.801 18:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:29.801 18:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:29.801 18:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:29.801 18:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:29.801 18:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:21:30.058 18:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:30.317 [ 00:21:30.317 { 00:21:30.317 "name": "BaseBdev2", 00:21:30.317 "aliases": [ 00:21:30.317 "f9a1398e-12f0-4cbe-9c34-b51c3935a1fd" 00:21:30.317 ], 00:21:30.317 "product_name": "Malloc disk", 00:21:30.317 "block_size": 512, 00:21:30.317 "num_blocks": 65536, 00:21:30.317 "uuid": "f9a1398e-12f0-4cbe-9c34-b51c3935a1fd", 00:21:30.317 "assigned_rate_limits": { 00:21:30.317 "rw_ios_per_sec": 0, 00:21:30.317 "rw_mbytes_per_sec": 0, 00:21:30.317 "r_mbytes_per_sec": 0, 00:21:30.317 "w_mbytes_per_sec": 0 00:21:30.317 }, 00:21:30.317 "claimed": true, 00:21:30.317 "claim_type": "exclusive_write", 00:21:30.317 "zoned": false, 00:21:30.317 "supported_io_types": { 00:21:30.317 "read": true, 00:21:30.317 "write": true, 00:21:30.317 "unmap": true, 00:21:30.317 "flush": true, 00:21:30.317 "reset": true, 00:21:30.317 "nvme_admin": false, 00:21:30.317 "nvme_io": false, 00:21:30.317 "nvme_io_md": false, 00:21:30.317 "write_zeroes": true, 00:21:30.317 "zcopy": true, 00:21:30.317 "get_zone_info": false, 00:21:30.317 "zone_management": false, 00:21:30.317 "zone_append": false, 00:21:30.317 "compare": false, 00:21:30.317 "compare_and_write": false, 00:21:30.317 "abort": true, 00:21:30.317 "seek_hole": false, 00:21:30.317 "seek_data": false, 00:21:30.317 "copy": true, 00:21:30.317 "nvme_iov_md": false 00:21:30.317 }, 00:21:30.317 "memory_domains": [ 00:21:30.317 { 00:21:30.317 "dma_device_id": "system", 00:21:30.317 "dma_device_type": 1 00:21:30.317 }, 00:21:30.317 { 00:21:30.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.317 "dma_device_type": 2 00:21:30.317 } 00:21:30.317 ], 00:21:30.317 "driver_specific": {} 00:21:30.317 } 00:21:30.317 ] 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.317 "name": "Existed_Raid", 00:21:30.317 "uuid": "f98c468a-8c4f-4a93-8581-f37fca9842fd", 00:21:30.317 "strip_size_kb": 0, 
00:21:30.317 "state": "configuring", 00:21:30.317 "raid_level": "raid1", 00:21:30.317 "superblock": true, 00:21:30.317 "num_base_bdevs": 4, 00:21:30.317 "num_base_bdevs_discovered": 2, 00:21:30.317 "num_base_bdevs_operational": 4, 00:21:30.317 "base_bdevs_list": [ 00:21:30.317 { 00:21:30.317 "name": "BaseBdev1", 00:21:30.317 "uuid": "22d14e60-204c-4fd5-b9a7-413548110f86", 00:21:30.317 "is_configured": true, 00:21:30.317 "data_offset": 2048, 00:21:30.317 "data_size": 63488 00:21:30.317 }, 00:21:30.317 { 00:21:30.317 "name": "BaseBdev2", 00:21:30.317 "uuid": "f9a1398e-12f0-4cbe-9c34-b51c3935a1fd", 00:21:30.317 "is_configured": true, 00:21:30.317 "data_offset": 2048, 00:21:30.317 "data_size": 63488 00:21:30.317 }, 00:21:30.317 { 00:21:30.317 "name": "BaseBdev3", 00:21:30.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.317 "is_configured": false, 00:21:30.317 "data_offset": 0, 00:21:30.317 "data_size": 0 00:21:30.317 }, 00:21:30.317 { 00:21:30.317 "name": "BaseBdev4", 00:21:30.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.317 "is_configured": false, 00:21:30.317 "data_offset": 0, 00:21:30.317 "data_size": 0 00:21:30.317 } 00:21:30.317 ] 00:21:30.317 }' 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.317 18:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:30.882 18:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:31.139 [2024-07-12 18:25:14.767976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:31.139 BaseBdev3 00:21:31.139 18:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:31.139 18:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:21:31.139 18:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:31.139 18:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:31.139 18:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:31.139 18:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:31.139 18:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:31.397 18:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:31.655 [ 00:21:31.655 { 00:21:31.655 "name": "BaseBdev3", 00:21:31.655 "aliases": [ 00:21:31.655 "d8d4f304-af41-4004-982c-cccd039599b1" 00:21:31.655 ], 00:21:31.655 "product_name": "Malloc disk", 00:21:31.655 "block_size": 512, 00:21:31.655 "num_blocks": 65536, 00:21:31.655 "uuid": "d8d4f304-af41-4004-982c-cccd039599b1", 00:21:31.655 "assigned_rate_limits": { 00:21:31.655 "rw_ios_per_sec": 0, 00:21:31.655 "rw_mbytes_per_sec": 0, 00:21:31.655 "r_mbytes_per_sec": 0, 00:21:31.655 "w_mbytes_per_sec": 0 00:21:31.655 }, 00:21:31.655 "claimed": true, 00:21:31.655 "claim_type": "exclusive_write", 00:21:31.655 "zoned": false, 00:21:31.655 "supported_io_types": { 00:21:31.655 "read": true, 00:21:31.655 "write": true, 00:21:31.655 "unmap": true, 00:21:31.655 "flush": true, 00:21:31.655 "reset": true, 00:21:31.655 "nvme_admin": false, 00:21:31.655 "nvme_io": false, 00:21:31.655 "nvme_io_md": false, 00:21:31.655 "write_zeroes": true, 00:21:31.655 "zcopy": true, 00:21:31.655 "get_zone_info": false, 00:21:31.655 "zone_management": false, 00:21:31.655 "zone_append": false, 00:21:31.655 
"compare": false, 00:21:31.655 "compare_and_write": false, 00:21:31.655 "abort": true, 00:21:31.655 "seek_hole": false, 00:21:31.655 "seek_data": false, 00:21:31.655 "copy": true, 00:21:31.655 "nvme_iov_md": false 00:21:31.655 }, 00:21:31.655 "memory_domains": [ 00:21:31.655 { 00:21:31.655 "dma_device_id": "system", 00:21:31.655 "dma_device_type": 1 00:21:31.655 }, 00:21:31.655 { 00:21:31.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.655 "dma_device_type": 2 00:21:31.655 } 00:21:31.655 ], 00:21:31.655 "driver_specific": {} 00:21:31.655 } 00:21:31.655 ] 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:31.655 18:25:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.655 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:31.912 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:31.912 "name": "Existed_Raid", 00:21:31.912 "uuid": "f98c468a-8c4f-4a93-8581-f37fca9842fd", 00:21:31.912 "strip_size_kb": 0, 00:21:31.912 "state": "configuring", 00:21:31.912 "raid_level": "raid1", 00:21:31.912 "superblock": true, 00:21:31.912 "num_base_bdevs": 4, 00:21:31.912 "num_base_bdevs_discovered": 3, 00:21:31.912 "num_base_bdevs_operational": 4, 00:21:31.912 "base_bdevs_list": [ 00:21:31.912 { 00:21:31.912 "name": "BaseBdev1", 00:21:31.912 "uuid": "22d14e60-204c-4fd5-b9a7-413548110f86", 00:21:31.912 "is_configured": true, 00:21:31.912 "data_offset": 2048, 00:21:31.912 "data_size": 63488 00:21:31.912 }, 00:21:31.912 { 00:21:31.912 "name": "BaseBdev2", 00:21:31.912 "uuid": "f9a1398e-12f0-4cbe-9c34-b51c3935a1fd", 00:21:31.912 "is_configured": true, 00:21:31.912 "data_offset": 2048, 00:21:31.912 "data_size": 63488 00:21:31.912 }, 00:21:31.912 { 00:21:31.912 "name": "BaseBdev3", 00:21:31.912 "uuid": "d8d4f304-af41-4004-982c-cccd039599b1", 00:21:31.912 "is_configured": true, 00:21:31.912 "data_offset": 2048, 00:21:31.912 "data_size": 63488 00:21:31.912 }, 00:21:31.912 { 00:21:31.912 "name": "BaseBdev4", 00:21:31.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:31.912 "is_configured": false, 00:21:31.912 "data_offset": 0, 00:21:31.912 "data_size": 0 00:21:31.912 } 00:21:31.912 ] 00:21:31.912 }' 00:21:31.912 18:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:31.912 18:25:15 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:32.477 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:32.735 [2024-07-12 18:25:16.363687] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:32.735 [2024-07-12 18:25:16.363853] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1237350 00:21:32.735 [2024-07-12 18:25:16.363867] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:32.735 [2024-07-12 18:25:16.364050] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1237020 00:21:32.735 [2024-07-12 18:25:16.364173] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1237350 00:21:32.735 [2024-07-12 18:25:16.364183] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1237350 00:21:32.735 [2024-07-12 18:25:16.364273] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:32.735 BaseBdev4 00:21:32.735 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:32.735 18:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:32.735 18:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:32.735 18:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:32.735 18:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:32.735 18:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:32.735 18:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:32.993 18:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:33.251 [ 00:21:33.251 { 00:21:33.251 "name": "BaseBdev4", 00:21:33.251 "aliases": [ 00:21:33.251 "62c115c2-5789-4a2f-b9b2-766fa0945366" 00:21:33.251 ], 00:21:33.251 "product_name": "Malloc disk", 00:21:33.251 "block_size": 512, 00:21:33.251 "num_blocks": 65536, 00:21:33.251 "uuid": "62c115c2-5789-4a2f-b9b2-766fa0945366", 00:21:33.251 "assigned_rate_limits": { 00:21:33.251 "rw_ios_per_sec": 0, 00:21:33.251 "rw_mbytes_per_sec": 0, 00:21:33.251 "r_mbytes_per_sec": 0, 00:21:33.251 "w_mbytes_per_sec": 0 00:21:33.251 }, 00:21:33.251 "claimed": true, 00:21:33.251 "claim_type": "exclusive_write", 00:21:33.251 "zoned": false, 00:21:33.251 "supported_io_types": { 00:21:33.251 "read": true, 00:21:33.251 "write": true, 00:21:33.251 "unmap": true, 00:21:33.251 "flush": true, 00:21:33.251 "reset": true, 00:21:33.251 "nvme_admin": false, 00:21:33.251 "nvme_io": false, 00:21:33.251 "nvme_io_md": false, 00:21:33.251 "write_zeroes": true, 00:21:33.251 "zcopy": true, 00:21:33.251 "get_zone_info": false, 00:21:33.251 "zone_management": false, 00:21:33.251 "zone_append": false, 00:21:33.251 "compare": false, 00:21:33.251 "compare_and_write": false, 00:21:33.251 "abort": true, 00:21:33.251 "seek_hole": false, 00:21:33.251 "seek_data": false, 00:21:33.251 "copy": true, 00:21:33.251 "nvme_iov_md": false 00:21:33.251 }, 00:21:33.251 "memory_domains": [ 00:21:33.251 { 00:21:33.251 "dma_device_id": "system", 00:21:33.251 "dma_device_type": 1 00:21:33.251 }, 00:21:33.251 { 00:21:33.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.251 "dma_device_type": 2 00:21:33.251 } 00:21:33.251 ], 00:21:33.251 "driver_specific": {} 00:21:33.251 } 00:21:33.251 ] 
00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.251 18:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:33.509 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.509 "name": "Existed_Raid", 00:21:33.509 
"uuid": "f98c468a-8c4f-4a93-8581-f37fca9842fd", 00:21:33.509 "strip_size_kb": 0, 00:21:33.509 "state": "online", 00:21:33.509 "raid_level": "raid1", 00:21:33.509 "superblock": true, 00:21:33.509 "num_base_bdevs": 4, 00:21:33.509 "num_base_bdevs_discovered": 4, 00:21:33.509 "num_base_bdevs_operational": 4, 00:21:33.509 "base_bdevs_list": [ 00:21:33.509 { 00:21:33.509 "name": "BaseBdev1", 00:21:33.509 "uuid": "22d14e60-204c-4fd5-b9a7-413548110f86", 00:21:33.509 "is_configured": true, 00:21:33.510 "data_offset": 2048, 00:21:33.510 "data_size": 63488 00:21:33.510 }, 00:21:33.510 { 00:21:33.510 "name": "BaseBdev2", 00:21:33.510 "uuid": "f9a1398e-12f0-4cbe-9c34-b51c3935a1fd", 00:21:33.510 "is_configured": true, 00:21:33.510 "data_offset": 2048, 00:21:33.510 "data_size": 63488 00:21:33.510 }, 00:21:33.510 { 00:21:33.510 "name": "BaseBdev3", 00:21:33.510 "uuid": "d8d4f304-af41-4004-982c-cccd039599b1", 00:21:33.510 "is_configured": true, 00:21:33.510 "data_offset": 2048, 00:21:33.510 "data_size": 63488 00:21:33.510 }, 00:21:33.510 { 00:21:33.510 "name": "BaseBdev4", 00:21:33.510 "uuid": "62c115c2-5789-4a2f-b9b2-766fa0945366", 00:21:33.510 "is_configured": true, 00:21:33.510 "data_offset": 2048, 00:21:33.510 "data_size": 63488 00:21:33.510 } 00:21:33.510 ] 00:21:33.510 }' 00:21:33.510 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.510 18:25:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:34.076 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:34.076 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:34.076 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:34.076 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:34.076 18:25:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:34.076 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:34.076 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:34.076 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:34.334 [2024-07-12 18:25:17.940196] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:34.334 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:34.334 "name": "Existed_Raid", 00:21:34.334 "aliases": [ 00:21:34.334 "f98c468a-8c4f-4a93-8581-f37fca9842fd" 00:21:34.334 ], 00:21:34.334 "product_name": "Raid Volume", 00:21:34.334 "block_size": 512, 00:21:34.334 "num_blocks": 63488, 00:21:34.334 "uuid": "f98c468a-8c4f-4a93-8581-f37fca9842fd", 00:21:34.334 "assigned_rate_limits": { 00:21:34.334 "rw_ios_per_sec": 0, 00:21:34.334 "rw_mbytes_per_sec": 0, 00:21:34.334 "r_mbytes_per_sec": 0, 00:21:34.334 "w_mbytes_per_sec": 0 00:21:34.334 }, 00:21:34.334 "claimed": false, 00:21:34.334 "zoned": false, 00:21:34.334 "supported_io_types": { 00:21:34.334 "read": true, 00:21:34.334 "write": true, 00:21:34.334 "unmap": false, 00:21:34.334 "flush": false, 00:21:34.334 "reset": true, 00:21:34.334 "nvme_admin": false, 00:21:34.334 "nvme_io": false, 00:21:34.334 "nvme_io_md": false, 00:21:34.334 "write_zeroes": true, 00:21:34.334 "zcopy": false, 00:21:34.334 "get_zone_info": false, 00:21:34.334 "zone_management": false, 00:21:34.334 "zone_append": false, 00:21:34.334 "compare": false, 00:21:34.334 "compare_and_write": false, 00:21:34.334 "abort": false, 00:21:34.334 "seek_hole": false, 00:21:34.334 "seek_data": false, 00:21:34.334 "copy": false, 00:21:34.334 "nvme_iov_md": false 00:21:34.334 }, 00:21:34.334 
"memory_domains": [ 00:21:34.334 { 00:21:34.334 "dma_device_id": "system", 00:21:34.334 "dma_device_type": 1 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.334 "dma_device_type": 2 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "dma_device_id": "system", 00:21:34.334 "dma_device_type": 1 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.334 "dma_device_type": 2 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "dma_device_id": "system", 00:21:34.334 "dma_device_type": 1 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.334 "dma_device_type": 2 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "dma_device_id": "system", 00:21:34.334 "dma_device_type": 1 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.334 "dma_device_type": 2 00:21:34.334 } 00:21:34.334 ], 00:21:34.334 "driver_specific": { 00:21:34.334 "raid": { 00:21:34.334 "uuid": "f98c468a-8c4f-4a93-8581-f37fca9842fd", 00:21:34.334 "strip_size_kb": 0, 00:21:34.334 "state": "online", 00:21:34.334 "raid_level": "raid1", 00:21:34.334 "superblock": true, 00:21:34.334 "num_base_bdevs": 4, 00:21:34.334 "num_base_bdevs_discovered": 4, 00:21:34.334 "num_base_bdevs_operational": 4, 00:21:34.334 "base_bdevs_list": [ 00:21:34.334 { 00:21:34.334 "name": "BaseBdev1", 00:21:34.334 "uuid": "22d14e60-204c-4fd5-b9a7-413548110f86", 00:21:34.334 "is_configured": true, 00:21:34.334 "data_offset": 2048, 00:21:34.334 "data_size": 63488 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "name": "BaseBdev2", 00:21:34.334 "uuid": "f9a1398e-12f0-4cbe-9c34-b51c3935a1fd", 00:21:34.334 "is_configured": true, 00:21:34.334 "data_offset": 2048, 00:21:34.334 "data_size": 63488 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "name": "BaseBdev3", 00:21:34.334 "uuid": "d8d4f304-af41-4004-982c-cccd039599b1", 00:21:34.334 "is_configured": true, 00:21:34.334 "data_offset": 2048, 00:21:34.334 
"data_size": 63488 00:21:34.334 }, 00:21:34.334 { 00:21:34.334 "name": "BaseBdev4", 00:21:34.334 "uuid": "62c115c2-5789-4a2f-b9b2-766fa0945366", 00:21:34.334 "is_configured": true, 00:21:34.334 "data_offset": 2048, 00:21:34.334 "data_size": 63488 00:21:34.334 } 00:21:34.334 ] 00:21:34.334 } 00:21:34.334 } 00:21:34.334 }' 00:21:34.334 18:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:34.334 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:34.334 BaseBdev2 00:21:34.334 BaseBdev3 00:21:34.334 BaseBdev4' 00:21:34.334 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:34.334 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:34.334 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:34.592 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:34.592 "name": "BaseBdev1", 00:21:34.592 "aliases": [ 00:21:34.592 "22d14e60-204c-4fd5-b9a7-413548110f86" 00:21:34.592 ], 00:21:34.592 "product_name": "Malloc disk", 00:21:34.592 "block_size": 512, 00:21:34.592 "num_blocks": 65536, 00:21:34.592 "uuid": "22d14e60-204c-4fd5-b9a7-413548110f86", 00:21:34.592 "assigned_rate_limits": { 00:21:34.592 "rw_ios_per_sec": 0, 00:21:34.592 "rw_mbytes_per_sec": 0, 00:21:34.592 "r_mbytes_per_sec": 0, 00:21:34.592 "w_mbytes_per_sec": 0 00:21:34.592 }, 00:21:34.592 "claimed": true, 00:21:34.592 "claim_type": "exclusive_write", 00:21:34.592 "zoned": false, 00:21:34.592 "supported_io_types": { 00:21:34.592 "read": true, 00:21:34.592 "write": true, 00:21:34.592 "unmap": true, 00:21:34.592 "flush": true, 00:21:34.592 "reset": true, 
00:21:34.592 "nvme_admin": false, 00:21:34.592 "nvme_io": false, 00:21:34.592 "nvme_io_md": false, 00:21:34.592 "write_zeroes": true, 00:21:34.592 "zcopy": true, 00:21:34.592 "get_zone_info": false, 00:21:34.592 "zone_management": false, 00:21:34.592 "zone_append": false, 00:21:34.592 "compare": false, 00:21:34.592 "compare_and_write": false, 00:21:34.592 "abort": true, 00:21:34.592 "seek_hole": false, 00:21:34.592 "seek_data": false, 00:21:34.592 "copy": true, 00:21:34.592 "nvme_iov_md": false 00:21:34.592 }, 00:21:34.592 "memory_domains": [ 00:21:34.592 { 00:21:34.592 "dma_device_id": "system", 00:21:34.592 "dma_device_type": 1 00:21:34.592 }, 00:21:34.592 { 00:21:34.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.592 "dma_device_type": 2 00:21:34.592 } 00:21:34.592 ], 00:21:34.592 "driver_specific": {} 00:21:34.592 }' 00:21:34.592 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.592 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.592 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:34.592 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:34.850 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:35.108 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:35.108 "name": "BaseBdev2", 00:21:35.108 "aliases": [ 00:21:35.108 "f9a1398e-12f0-4cbe-9c34-b51c3935a1fd" 00:21:35.108 ], 00:21:35.108 "product_name": "Malloc disk", 00:21:35.108 "block_size": 512, 00:21:35.108 "num_blocks": 65536, 00:21:35.108 "uuid": "f9a1398e-12f0-4cbe-9c34-b51c3935a1fd", 00:21:35.108 "assigned_rate_limits": { 00:21:35.108 "rw_ios_per_sec": 0, 00:21:35.108 "rw_mbytes_per_sec": 0, 00:21:35.108 "r_mbytes_per_sec": 0, 00:21:35.108 "w_mbytes_per_sec": 0 00:21:35.108 }, 00:21:35.108 "claimed": true, 00:21:35.108 "claim_type": "exclusive_write", 00:21:35.108 "zoned": false, 00:21:35.108 "supported_io_types": { 00:21:35.108 "read": true, 00:21:35.108 "write": true, 00:21:35.108 "unmap": true, 00:21:35.108 "flush": true, 00:21:35.108 "reset": true, 00:21:35.108 "nvme_admin": false, 00:21:35.108 "nvme_io": false, 00:21:35.108 "nvme_io_md": false, 00:21:35.108 "write_zeroes": true, 00:21:35.108 "zcopy": true, 00:21:35.108 "get_zone_info": false, 00:21:35.108 "zone_management": false, 00:21:35.108 "zone_append": false, 00:21:35.108 "compare": false, 00:21:35.108 "compare_and_write": false, 00:21:35.108 "abort": true, 00:21:35.108 "seek_hole": false, 00:21:35.108 "seek_data": false, 00:21:35.108 "copy": true, 00:21:35.108 "nvme_iov_md": false 00:21:35.108 }, 00:21:35.108 "memory_domains": [ 00:21:35.108 { 
00:21:35.108 "dma_device_id": "system", 00:21:35.108 "dma_device_type": 1 00:21:35.108 }, 00:21:35.108 { 00:21:35.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.108 "dma_device_type": 2 00:21:35.108 } 00:21:35.108 ], 00:21:35.108 "driver_specific": {} 00:21:35.108 }' 00:21:35.108 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.108 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.365 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:35.365 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.365 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.365 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:35.365 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.365 18:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.365 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:35.365 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.365 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.622 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:35.622 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:35.622 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:35.622 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:35.881 18:25:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:35.881 "name": "BaseBdev3", 00:21:35.881 "aliases": [ 00:21:35.881 "d8d4f304-af41-4004-982c-cccd039599b1" 00:21:35.881 ], 00:21:35.881 "product_name": "Malloc disk", 00:21:35.881 "block_size": 512, 00:21:35.881 "num_blocks": 65536, 00:21:35.881 "uuid": "d8d4f304-af41-4004-982c-cccd039599b1", 00:21:35.881 "assigned_rate_limits": { 00:21:35.881 "rw_ios_per_sec": 0, 00:21:35.881 "rw_mbytes_per_sec": 0, 00:21:35.881 "r_mbytes_per_sec": 0, 00:21:35.881 "w_mbytes_per_sec": 0 00:21:35.881 }, 00:21:35.881 "claimed": true, 00:21:35.881 "claim_type": "exclusive_write", 00:21:35.881 "zoned": false, 00:21:35.881 "supported_io_types": { 00:21:35.881 "read": true, 00:21:35.881 "write": true, 00:21:35.881 "unmap": true, 00:21:35.881 "flush": true, 00:21:35.881 "reset": true, 00:21:35.881 "nvme_admin": false, 00:21:35.881 "nvme_io": false, 00:21:35.881 "nvme_io_md": false, 00:21:35.881 "write_zeroes": true, 00:21:35.881 "zcopy": true, 00:21:35.881 "get_zone_info": false, 00:21:35.881 "zone_management": false, 00:21:35.881 "zone_append": false, 00:21:35.881 "compare": false, 00:21:35.881 "compare_and_write": false, 00:21:35.881 "abort": true, 00:21:35.881 "seek_hole": false, 00:21:35.881 "seek_data": false, 00:21:35.881 "copy": true, 00:21:35.881 "nvme_iov_md": false 00:21:35.881 }, 00:21:35.881 "memory_domains": [ 00:21:35.881 { 00:21:35.881 "dma_device_id": "system", 00:21:35.881 "dma_device_type": 1 00:21:35.881 }, 00:21:35.881 { 00:21:35.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.881 "dma_device_type": 2 00:21:35.881 } 00:21:35.881 ], 00:21:35.881 "driver_specific": {} 00:21:35.881 }' 00:21:35.881 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.881 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.881 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:21:35.881 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.881 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.881 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:35.881 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.881 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.139 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:36.140 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.140 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.140 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:36.140 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:36.140 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:36.140 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:36.398 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:36.398 "name": "BaseBdev4", 00:21:36.398 "aliases": [ 00:21:36.398 "62c115c2-5789-4a2f-b9b2-766fa0945366" 00:21:36.398 ], 00:21:36.398 "product_name": "Malloc disk", 00:21:36.398 "block_size": 512, 00:21:36.398 "num_blocks": 65536, 00:21:36.398 "uuid": "62c115c2-5789-4a2f-b9b2-766fa0945366", 00:21:36.398 "assigned_rate_limits": { 00:21:36.398 "rw_ios_per_sec": 0, 00:21:36.398 "rw_mbytes_per_sec": 0, 00:21:36.398 "r_mbytes_per_sec": 0, 00:21:36.398 "w_mbytes_per_sec": 0 
00:21:36.398 }, 00:21:36.398 "claimed": true, 00:21:36.398 "claim_type": "exclusive_write", 00:21:36.398 "zoned": false, 00:21:36.398 "supported_io_types": { 00:21:36.398 "read": true, 00:21:36.398 "write": true, 00:21:36.398 "unmap": true, 00:21:36.398 "flush": true, 00:21:36.398 "reset": true, 00:21:36.398 "nvme_admin": false, 00:21:36.398 "nvme_io": false, 00:21:36.398 "nvme_io_md": false, 00:21:36.398 "write_zeroes": true, 00:21:36.398 "zcopy": true, 00:21:36.399 "get_zone_info": false, 00:21:36.399 "zone_management": false, 00:21:36.399 "zone_append": false, 00:21:36.399 "compare": false, 00:21:36.399 "compare_and_write": false, 00:21:36.399 "abort": true, 00:21:36.399 "seek_hole": false, 00:21:36.399 "seek_data": false, 00:21:36.399 "copy": true, 00:21:36.399 "nvme_iov_md": false 00:21:36.399 }, 00:21:36.399 "memory_domains": [ 00:21:36.399 { 00:21:36.399 "dma_device_id": "system", 00:21:36.399 "dma_device_type": 1 00:21:36.399 }, 00:21:36.399 { 00:21:36.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:36.399 "dma_device_type": 2 00:21:36.399 } 00:21:36.399 ], 00:21:36.399 "driver_specific": {} 00:21:36.399 }' 00:21:36.399 18:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:36.399 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:36.399 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:36.399 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:36.399 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:36.657 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:36.657 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.657 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.657 
18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:36.657 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.657 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.657 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:36.657 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:36.916 [2024-07-12 18:25:20.546818] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.916 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:37.174 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.174 "name": "Existed_Raid", 00:21:37.174 "uuid": "f98c468a-8c4f-4a93-8581-f37fca9842fd", 00:21:37.174 "strip_size_kb": 0, 00:21:37.174 "state": "online", 00:21:37.174 "raid_level": "raid1", 00:21:37.174 "superblock": true, 00:21:37.174 "num_base_bdevs": 4, 00:21:37.174 "num_base_bdevs_discovered": 3, 00:21:37.174 "num_base_bdevs_operational": 3, 00:21:37.174 "base_bdevs_list": [ 00:21:37.174 { 00:21:37.174 "name": null, 00:21:37.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.174 "is_configured": false, 00:21:37.174 "data_offset": 2048, 00:21:37.174 "data_size": 63488 00:21:37.174 }, 00:21:37.174 { 00:21:37.174 "name": "BaseBdev2", 00:21:37.174 "uuid": "f9a1398e-12f0-4cbe-9c34-b51c3935a1fd", 00:21:37.174 "is_configured": true, 00:21:37.174 "data_offset": 2048, 00:21:37.174 "data_size": 63488 00:21:37.174 }, 00:21:37.174 { 00:21:37.174 "name": "BaseBdev3", 00:21:37.174 "uuid": "d8d4f304-af41-4004-982c-cccd039599b1", 00:21:37.174 "is_configured": true, 00:21:37.174 "data_offset": 2048, 00:21:37.174 "data_size": 63488 00:21:37.174 }, 00:21:37.174 { 00:21:37.174 "name": 
"BaseBdev4", 00:21:37.174 "uuid": "62c115c2-5789-4a2f-b9b2-766fa0945366", 00:21:37.174 "is_configured": true, 00:21:37.174 "data_offset": 2048, 00:21:37.174 "data_size": 63488 00:21:37.174 } 00:21:37.174 ] 00:21:37.174 }' 00:21:37.174 18:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.174 18:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:37.740 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:37.741 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:37.741 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.741 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:37.999 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:37.999 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:37.999 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:38.258 [2024-07-12 18:25:21.895490] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:38.258 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:38.258 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:38.258 18:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.258 18:25:21 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:38.517 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:38.517 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:38.517 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:38.776 [2024-07-12 18:25:22.401238] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:38.776 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:38.776 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:38.776 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.776 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:39.116 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:39.116 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:39.116 18:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:39.710 [2024-07-12 18:25:23.163704] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:39.710 [2024-07-12 18:25:23.163784] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:39.710 [2024-07-12 18:25:23.174223] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:39.710 [2024-07-12 18:25:23.174254] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:39.710 [2024-07-12 18:25:23.174266] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1237350 name Existed_Raid, state offline 00:21:39.710 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:39.710 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:39.710 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.710 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:39.969 BaseBdev2 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:39.969 18:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:40.536 18:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:41.103 [ 00:21:41.103 { 00:21:41.103 "name": "BaseBdev2", 00:21:41.103 "aliases": [ 00:21:41.103 "0332b9ff-8097-4b92-8362-9b22b179abb6" 00:21:41.103 ], 00:21:41.103 "product_name": "Malloc disk", 00:21:41.103 "block_size": 512, 00:21:41.103 "num_blocks": 65536, 00:21:41.103 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:41.103 "assigned_rate_limits": { 00:21:41.103 "rw_ios_per_sec": 0, 00:21:41.103 "rw_mbytes_per_sec": 0, 00:21:41.103 "r_mbytes_per_sec": 0, 00:21:41.103 "w_mbytes_per_sec": 0 00:21:41.103 }, 00:21:41.103 "claimed": false, 00:21:41.103 "zoned": false, 00:21:41.103 "supported_io_types": { 00:21:41.103 "read": true, 00:21:41.103 "write": true, 00:21:41.103 "unmap": true, 00:21:41.103 "flush": true, 00:21:41.103 "reset": true, 00:21:41.103 "nvme_admin": false, 00:21:41.103 "nvme_io": false, 00:21:41.103 "nvme_io_md": false, 00:21:41.103 "write_zeroes": true, 00:21:41.103 "zcopy": true, 00:21:41.103 "get_zone_info": false, 00:21:41.103 "zone_management": false, 00:21:41.103 "zone_append": false, 00:21:41.103 "compare": false, 00:21:41.103 "compare_and_write": false, 00:21:41.103 "abort": true, 00:21:41.103 "seek_hole": false, 00:21:41.103 "seek_data": false, 00:21:41.103 "copy": true, 00:21:41.103 "nvme_iov_md": false 00:21:41.103 }, 00:21:41.103 
"memory_domains": [ 00:21:41.103 { 00:21:41.103 "dma_device_id": "system", 00:21:41.103 "dma_device_type": 1 00:21:41.103 }, 00:21:41.103 { 00:21:41.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:41.103 "dma_device_type": 2 00:21:41.103 } 00:21:41.103 ], 00:21:41.103 "driver_specific": {} 00:21:41.103 } 00:21:41.103 ] 00:21:41.103 18:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:41.103 18:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:41.103 18:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:41.103 18:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:41.362 BaseBdev3 00:21:41.362 18:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:41.362 18:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:41.362 18:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:41.362 18:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:41.362 18:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:41.362 18:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:41.362 18:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:41.929 18:25:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:21:42.187 [ 00:21:42.187 { 00:21:42.187 "name": "BaseBdev3", 00:21:42.187 "aliases": [ 00:21:42.187 "13a54bb6-29b6-4e6a-b83f-d66877399424" 00:21:42.187 ], 00:21:42.187 "product_name": "Malloc disk", 00:21:42.187 "block_size": 512, 00:21:42.187 "num_blocks": 65536, 00:21:42.187 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:42.187 "assigned_rate_limits": { 00:21:42.187 "rw_ios_per_sec": 0, 00:21:42.187 "rw_mbytes_per_sec": 0, 00:21:42.187 "r_mbytes_per_sec": 0, 00:21:42.187 "w_mbytes_per_sec": 0 00:21:42.187 }, 00:21:42.187 "claimed": false, 00:21:42.187 "zoned": false, 00:21:42.187 "supported_io_types": { 00:21:42.187 "read": true, 00:21:42.187 "write": true, 00:21:42.187 "unmap": true, 00:21:42.187 "flush": true, 00:21:42.187 "reset": true, 00:21:42.187 "nvme_admin": false, 00:21:42.187 "nvme_io": false, 00:21:42.187 "nvme_io_md": false, 00:21:42.187 "write_zeroes": true, 00:21:42.187 "zcopy": true, 00:21:42.187 "get_zone_info": false, 00:21:42.187 "zone_management": false, 00:21:42.187 "zone_append": false, 00:21:42.187 "compare": false, 00:21:42.187 "compare_and_write": false, 00:21:42.187 "abort": true, 00:21:42.187 "seek_hole": false, 00:21:42.187 "seek_data": false, 00:21:42.187 "copy": true, 00:21:42.187 "nvme_iov_md": false 00:21:42.187 }, 00:21:42.187 "memory_domains": [ 00:21:42.187 { 00:21:42.187 "dma_device_id": "system", 00:21:42.187 "dma_device_type": 1 00:21:42.187 }, 00:21:42.187 { 00:21:42.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.187 "dma_device_type": 2 00:21:42.187 } 00:21:42.187 ], 00:21:42.187 "driver_specific": {} 00:21:42.187 } 00:21:42.187 ] 00:21:42.187 18:25:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:42.187 18:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:42.187 18:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:42.187 18:25:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:42.755 BaseBdev4 00:21:42.755 18:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:42.755 18:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:42.755 18:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:42.755 18:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:42.755 18:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:42.755 18:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:42.755 18:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:42.755 18:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:43.321 [ 00:21:43.321 { 00:21:43.321 "name": "BaseBdev4", 00:21:43.321 "aliases": [ 00:21:43.321 "a533f0de-60fe-40e5-bb92-3ae6a238737c" 00:21:43.321 ], 00:21:43.321 "product_name": "Malloc disk", 00:21:43.321 "block_size": 512, 00:21:43.321 "num_blocks": 65536, 00:21:43.321 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:43.321 "assigned_rate_limits": { 00:21:43.321 "rw_ios_per_sec": 0, 00:21:43.321 "rw_mbytes_per_sec": 0, 00:21:43.321 "r_mbytes_per_sec": 0, 00:21:43.321 "w_mbytes_per_sec": 0 00:21:43.321 }, 00:21:43.321 "claimed": false, 00:21:43.321 "zoned": false, 00:21:43.321 "supported_io_types": { 00:21:43.321 "read": true, 
00:21:43.321 "write": true, 00:21:43.321 "unmap": true, 00:21:43.321 "flush": true, 00:21:43.321 "reset": true, 00:21:43.321 "nvme_admin": false, 00:21:43.321 "nvme_io": false, 00:21:43.321 "nvme_io_md": false, 00:21:43.321 "write_zeroes": true, 00:21:43.321 "zcopy": true, 00:21:43.321 "get_zone_info": false, 00:21:43.321 "zone_management": false, 00:21:43.321 "zone_append": false, 00:21:43.321 "compare": false, 00:21:43.321 "compare_and_write": false, 00:21:43.321 "abort": true, 00:21:43.321 "seek_hole": false, 00:21:43.321 "seek_data": false, 00:21:43.321 "copy": true, 00:21:43.322 "nvme_iov_md": false 00:21:43.322 }, 00:21:43.322 "memory_domains": [ 00:21:43.322 { 00:21:43.322 "dma_device_id": "system", 00:21:43.322 "dma_device_type": 1 00:21:43.322 }, 00:21:43.322 { 00:21:43.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.322 "dma_device_type": 2 00:21:43.322 } 00:21:43.322 ], 00:21:43.322 "driver_specific": {} 00:21:43.322 } 00:21:43.322 ] 00:21:43.322 18:25:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:43.322 18:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:43.322 18:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:43.322 18:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:43.580 [2024-07-12 18:25:27.195123] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:43.580 [2024-07-12 18:25:27.195162] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:43.580 [2024-07-12 18:25:27.195180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:43.580 [2024-07-12 18:25:27.196487] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:43.580 [2024-07-12 18:25:27.196535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.580 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:43.839 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:43.839 "name": "Existed_Raid", 00:21:43.839 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:43.839 "strip_size_kb": 0, 00:21:43.839 "state": 
"configuring", 00:21:43.839 "raid_level": "raid1", 00:21:43.839 "superblock": true, 00:21:43.839 "num_base_bdevs": 4, 00:21:43.839 "num_base_bdevs_discovered": 3, 00:21:43.839 "num_base_bdevs_operational": 4, 00:21:43.839 "base_bdevs_list": [ 00:21:43.839 { 00:21:43.839 "name": "BaseBdev1", 00:21:43.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.839 "is_configured": false, 00:21:43.839 "data_offset": 0, 00:21:43.839 "data_size": 0 00:21:43.839 }, 00:21:43.839 { 00:21:43.839 "name": "BaseBdev2", 00:21:43.839 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:43.839 "is_configured": true, 00:21:43.839 "data_offset": 2048, 00:21:43.839 "data_size": 63488 00:21:43.839 }, 00:21:43.839 { 00:21:43.839 "name": "BaseBdev3", 00:21:43.839 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:43.839 "is_configured": true, 00:21:43.839 "data_offset": 2048, 00:21:43.839 "data_size": 63488 00:21:43.839 }, 00:21:43.839 { 00:21:43.839 "name": "BaseBdev4", 00:21:43.839 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:43.839 "is_configured": true, 00:21:43.839 "data_offset": 2048, 00:21:43.839 "data_size": 63488 00:21:43.839 } 00:21:43.839 ] 00:21:43.839 }' 00:21:43.839 18:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:43.839 18:25:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:44.405 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:44.664 [2024-07-12 18:25:28.273980] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:44.664 
18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:44.664 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.922 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:44.922 "name": "Existed_Raid", 00:21:44.922 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:44.922 "strip_size_kb": 0, 00:21:44.922 "state": "configuring", 00:21:44.922 "raid_level": "raid1", 00:21:44.922 "superblock": true, 00:21:44.922 "num_base_bdevs": 4, 00:21:44.922 "num_base_bdevs_discovered": 2, 00:21:44.922 "num_base_bdevs_operational": 4, 00:21:44.922 "base_bdevs_list": [ 00:21:44.922 { 00:21:44.922 "name": "BaseBdev1", 00:21:44.922 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.922 "is_configured": false, 00:21:44.922 "data_offset": 0, 00:21:44.922 "data_size": 0 00:21:44.922 }, 00:21:44.922 { 00:21:44.922 
"name": null, 00:21:44.922 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:44.922 "is_configured": false, 00:21:44.922 "data_offset": 2048, 00:21:44.922 "data_size": 63488 00:21:44.922 }, 00:21:44.922 { 00:21:44.922 "name": "BaseBdev3", 00:21:44.922 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:44.922 "is_configured": true, 00:21:44.922 "data_offset": 2048, 00:21:44.922 "data_size": 63488 00:21:44.922 }, 00:21:44.922 { 00:21:44.922 "name": "BaseBdev4", 00:21:44.922 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:44.922 "is_configured": true, 00:21:44.922 "data_offset": 2048, 00:21:44.922 "data_size": 63488 00:21:44.922 } 00:21:44.922 ] 00:21:44.922 }' 00:21:44.922 18:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:44.922 18:25:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:45.488 18:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.488 18:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:45.746 18:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:45.746 18:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:46.314 [2024-07-12 18:25:29.810609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:46.314 BaseBdev1 00:21:46.314 18:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:46.314 18:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:46.314 18:25:29 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:46.314 18:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:46.314 18:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:46.314 18:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:46.314 18:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:46.572 18:25:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:46.831 [ 00:21:46.831 { 00:21:46.831 "name": "BaseBdev1", 00:21:46.831 "aliases": [ 00:21:46.831 "5714ba19-bd74-4a47-9199-5bb08f7fb39d" 00:21:46.831 ], 00:21:46.831 "product_name": "Malloc disk", 00:21:46.831 "block_size": 512, 00:21:46.831 "num_blocks": 65536, 00:21:46.831 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:46.831 "assigned_rate_limits": { 00:21:46.831 "rw_ios_per_sec": 0, 00:21:46.831 "rw_mbytes_per_sec": 0, 00:21:46.831 "r_mbytes_per_sec": 0, 00:21:46.831 "w_mbytes_per_sec": 0 00:21:46.831 }, 00:21:46.831 "claimed": true, 00:21:46.831 "claim_type": "exclusive_write", 00:21:46.831 "zoned": false, 00:21:46.831 "supported_io_types": { 00:21:46.831 "read": true, 00:21:46.831 "write": true, 00:21:46.831 "unmap": true, 00:21:46.831 "flush": true, 00:21:46.831 "reset": true, 00:21:46.831 "nvme_admin": false, 00:21:46.831 "nvme_io": false, 00:21:46.831 "nvme_io_md": false, 00:21:46.831 "write_zeroes": true, 00:21:46.831 "zcopy": true, 00:21:46.831 "get_zone_info": false, 00:21:46.831 "zone_management": false, 00:21:46.831 "zone_append": false, 00:21:46.831 "compare": false, 00:21:46.831 
"compare_and_write": false, 00:21:46.831 "abort": true, 00:21:46.831 "seek_hole": false, 00:21:46.831 "seek_data": false, 00:21:46.831 "copy": true, 00:21:46.831 "nvme_iov_md": false 00:21:46.831 }, 00:21:46.831 "memory_domains": [ 00:21:46.831 { 00:21:46.831 "dma_device_id": "system", 00:21:46.831 "dma_device_type": 1 00:21:46.831 }, 00:21:46.831 { 00:21:46.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.831 "dma_device_type": 2 00:21:46.831 } 00:21:46.831 ], 00:21:46.831 "driver_specific": {} 00:21:46.831 } 00:21:46.831 ] 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.831 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:47.090 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.090 "name": "Existed_Raid", 00:21:47.090 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:47.091 "strip_size_kb": 0, 00:21:47.091 "state": "configuring", 00:21:47.091 "raid_level": "raid1", 00:21:47.091 "superblock": true, 00:21:47.091 "num_base_bdevs": 4, 00:21:47.091 "num_base_bdevs_discovered": 3, 00:21:47.091 "num_base_bdevs_operational": 4, 00:21:47.091 "base_bdevs_list": [ 00:21:47.091 { 00:21:47.091 "name": "BaseBdev1", 00:21:47.091 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:47.091 "is_configured": true, 00:21:47.091 "data_offset": 2048, 00:21:47.091 "data_size": 63488 00:21:47.091 }, 00:21:47.091 { 00:21:47.091 "name": null, 00:21:47.091 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:47.091 "is_configured": false, 00:21:47.091 "data_offset": 2048, 00:21:47.091 "data_size": 63488 00:21:47.091 }, 00:21:47.091 { 00:21:47.091 "name": "BaseBdev3", 00:21:47.091 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:47.091 "is_configured": true, 00:21:47.091 "data_offset": 2048, 00:21:47.091 "data_size": 63488 00:21:47.091 }, 00:21:47.091 { 00:21:47.091 "name": "BaseBdev4", 00:21:47.091 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:47.091 "is_configured": true, 00:21:47.091 "data_offset": 2048, 00:21:47.091 "data_size": 63488 00:21:47.091 } 00:21:47.091 ] 00:21:47.091 }' 00:21:47.091 18:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.091 18:25:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:47.658 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.658 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:47.916 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:47.916 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:47.916 [2024-07-12 18:25:31.631504] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:48.175 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:48.434 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.434 "name": "Existed_Raid", 00:21:48.434 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:48.434 "strip_size_kb": 0, 00:21:48.434 "state": "configuring", 00:21:48.434 "raid_level": "raid1", 00:21:48.434 "superblock": true, 00:21:48.434 "num_base_bdevs": 4, 00:21:48.434 "num_base_bdevs_discovered": 2, 00:21:48.434 "num_base_bdevs_operational": 4, 00:21:48.434 "base_bdevs_list": [ 00:21:48.434 { 00:21:48.434 "name": "BaseBdev1", 00:21:48.434 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:48.434 "is_configured": true, 00:21:48.434 "data_offset": 2048, 00:21:48.434 "data_size": 63488 00:21:48.434 }, 00:21:48.434 { 00:21:48.434 "name": null, 00:21:48.434 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:48.434 "is_configured": false, 00:21:48.434 "data_offset": 2048, 00:21:48.434 "data_size": 63488 00:21:48.434 }, 00:21:48.434 { 00:21:48.434 "name": null, 00:21:48.434 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:48.434 "is_configured": false, 00:21:48.434 "data_offset": 2048, 00:21:48.434 "data_size": 63488 00:21:48.434 }, 00:21:48.434 { 00:21:48.434 "name": "BaseBdev4", 00:21:48.434 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:48.434 "is_configured": true, 00:21:48.434 "data_offset": 2048, 00:21:48.434 "data_size": 63488 00:21:48.434 } 00:21:48.434 ] 00:21:48.434 }' 00:21:48.434 18:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.434 18:25:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:49.000 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:49.000 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:49.258 [2024-07-12 18:25:32.959036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:49.258 18:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:49.516 18:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.516 "name": "Existed_Raid", 00:21:49.516 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:49.516 "strip_size_kb": 0, 00:21:49.516 "state": "configuring", 00:21:49.516 "raid_level": "raid1", 00:21:49.516 "superblock": true, 00:21:49.516 "num_base_bdevs": 4, 00:21:49.516 "num_base_bdevs_discovered": 3, 00:21:49.516 "num_base_bdevs_operational": 4, 00:21:49.516 "base_bdevs_list": [ 00:21:49.516 { 00:21:49.516 "name": "BaseBdev1", 00:21:49.516 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:49.516 "is_configured": true, 00:21:49.516 "data_offset": 2048, 00:21:49.516 "data_size": 63488 00:21:49.516 }, 00:21:49.516 { 00:21:49.516 "name": null, 00:21:49.516 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:49.516 "is_configured": false, 00:21:49.516 "data_offset": 2048, 00:21:49.516 "data_size": 63488 00:21:49.516 }, 00:21:49.516 { 00:21:49.516 "name": "BaseBdev3", 00:21:49.516 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:49.516 "is_configured": true, 00:21:49.516 "data_offset": 2048, 00:21:49.516 "data_size": 63488 00:21:49.516 }, 00:21:49.516 { 00:21:49.516 "name": "BaseBdev4", 00:21:49.516 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:49.516 "is_configured": true, 00:21:49.516 "data_offset": 2048, 00:21:49.516 "data_size": 63488 00:21:49.516 } 00:21:49.516 ] 00:21:49.516 }' 00:21:49.516 18:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.516 18:25:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:50.451 18:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:50.451 18:25:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:50.451 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:50.451 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:51.018 [2024-07-12 18:25:34.535230] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:51.018 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.018 18:25:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:51.278 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.278 "name": "Existed_Raid", 00:21:51.278 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:51.278 "strip_size_kb": 0, 00:21:51.278 "state": "configuring", 00:21:51.278 "raid_level": "raid1", 00:21:51.278 "superblock": true, 00:21:51.278 "num_base_bdevs": 4, 00:21:51.278 "num_base_bdevs_discovered": 2, 00:21:51.278 "num_base_bdevs_operational": 4, 00:21:51.278 "base_bdevs_list": [ 00:21:51.278 { 00:21:51.278 "name": null, 00:21:51.278 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:51.278 "is_configured": false, 00:21:51.278 "data_offset": 2048, 00:21:51.278 "data_size": 63488 00:21:51.278 }, 00:21:51.278 { 00:21:51.278 "name": null, 00:21:51.278 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:51.278 "is_configured": false, 00:21:51.278 "data_offset": 2048, 00:21:51.278 "data_size": 63488 00:21:51.278 }, 00:21:51.278 { 00:21:51.278 "name": "BaseBdev3", 00:21:51.278 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:51.278 "is_configured": true, 00:21:51.278 "data_offset": 2048, 00:21:51.278 "data_size": 63488 00:21:51.278 }, 00:21:51.278 { 00:21:51.278 "name": "BaseBdev4", 00:21:51.278 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:51.278 "is_configured": true, 00:21:51.278 "data_offset": 2048, 00:21:51.278 "data_size": 63488 00:21:51.278 } 00:21:51.278 ] 00:21:51.278 }' 00:21:51.278 18:25:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:51.278 18:25:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:51.847 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.847 18:25:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:52.105 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:52.105 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:52.364 [2024-07-12 18:25:35.873279] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.364 18:25:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.364 18:25:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:52.623 18:25:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.623 "name": "Existed_Raid", 00:21:52.623 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:52.623 "strip_size_kb": 0, 00:21:52.623 "state": "configuring", 00:21:52.623 "raid_level": "raid1", 00:21:52.623 "superblock": true, 00:21:52.623 "num_base_bdevs": 4, 00:21:52.623 "num_base_bdevs_discovered": 3, 00:21:52.623 "num_base_bdevs_operational": 4, 00:21:52.623 "base_bdevs_list": [ 00:21:52.623 { 00:21:52.623 "name": null, 00:21:52.623 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:52.623 "is_configured": false, 00:21:52.623 "data_offset": 2048, 00:21:52.623 "data_size": 63488 00:21:52.623 }, 00:21:52.623 { 00:21:52.623 "name": "BaseBdev2", 00:21:52.623 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:52.623 "is_configured": true, 00:21:52.623 "data_offset": 2048, 00:21:52.623 "data_size": 63488 00:21:52.623 }, 00:21:52.623 { 00:21:52.623 "name": "BaseBdev3", 00:21:52.623 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:52.623 "is_configured": true, 00:21:52.623 "data_offset": 2048, 00:21:52.623 "data_size": 63488 00:21:52.623 }, 00:21:52.623 { 00:21:52.623 "name": "BaseBdev4", 00:21:52.623 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:52.623 "is_configured": true, 00:21:52.623 "data_offset": 2048, 00:21:52.623 "data_size": 63488 00:21:52.623 } 00:21:52.623 ] 00:21:52.623 }' 00:21:52.623 18:25:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.623 18:25:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:53.190 18:25:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.190 18:25:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:53.449 18:25:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:53.449 18:25:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.449 18:25:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:53.708 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 5714ba19-bd74-4a47-9199-5bb08f7fb39d 00:21:53.708 [2024-07-12 18:25:37.421615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:53.708 [2024-07-12 18:25:37.421778] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1239180 00:21:53.708 [2024-07-12 18:25:37.421791] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:53.708 [2024-07-12 18:25:37.421979] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1239c20 00:21:53.708 [2024-07-12 18:25:37.422110] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1239180 00:21:53.708 [2024-07-12 18:25:37.422120] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1239180 00:21:53.709 [2024-07-12 18:25:37.422215] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:53.709 NewBaseBdev 00:21:53.967 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:53.967 18:25:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:53.967 18:25:37 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:53.967 18:25:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:53.967 18:25:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:53.967 18:25:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:53.967 18:25:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:53.967 18:25:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:54.226 [ 00:21:54.226 { 00:21:54.226 "name": "NewBaseBdev", 00:21:54.226 "aliases": [ 00:21:54.226 "5714ba19-bd74-4a47-9199-5bb08f7fb39d" 00:21:54.226 ], 00:21:54.226 "product_name": "Malloc disk", 00:21:54.226 "block_size": 512, 00:21:54.226 "num_blocks": 65536, 00:21:54.226 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:54.226 "assigned_rate_limits": { 00:21:54.226 "rw_ios_per_sec": 0, 00:21:54.226 "rw_mbytes_per_sec": 0, 00:21:54.226 "r_mbytes_per_sec": 0, 00:21:54.226 "w_mbytes_per_sec": 0 00:21:54.226 }, 00:21:54.226 "claimed": true, 00:21:54.226 "claim_type": "exclusive_write", 00:21:54.226 "zoned": false, 00:21:54.226 "supported_io_types": { 00:21:54.226 "read": true, 00:21:54.226 "write": true, 00:21:54.226 "unmap": true, 00:21:54.226 "flush": true, 00:21:54.226 "reset": true, 00:21:54.226 "nvme_admin": false, 00:21:54.226 "nvme_io": false, 00:21:54.226 "nvme_io_md": false, 00:21:54.226 "write_zeroes": true, 00:21:54.226 "zcopy": true, 00:21:54.226 "get_zone_info": false, 00:21:54.226 "zone_management": false, 00:21:54.226 "zone_append": false, 00:21:54.226 "compare": false, 00:21:54.226 
"compare_and_write": false, 00:21:54.226 "abort": true, 00:21:54.226 "seek_hole": false, 00:21:54.226 "seek_data": false, 00:21:54.226 "copy": true, 00:21:54.226 "nvme_iov_md": false 00:21:54.227 }, 00:21:54.227 "memory_domains": [ 00:21:54.227 { 00:21:54.227 "dma_device_id": "system", 00:21:54.227 "dma_device_type": 1 00:21:54.227 }, 00:21:54.227 { 00:21:54.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.227 "dma_device_type": 2 00:21:54.227 } 00:21:54.227 ], 00:21:54.227 "driver_specific": {} 00:21:54.227 } 00:21:54.227 ] 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:54.227 18:25:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:54.227 18:25:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.485 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.485 "name": "Existed_Raid", 00:21:54.485 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:54.485 "strip_size_kb": 0, 00:21:54.485 "state": "online", 00:21:54.485 "raid_level": "raid1", 00:21:54.485 "superblock": true, 00:21:54.485 "num_base_bdevs": 4, 00:21:54.485 "num_base_bdevs_discovered": 4, 00:21:54.485 "num_base_bdevs_operational": 4, 00:21:54.485 "base_bdevs_list": [ 00:21:54.485 { 00:21:54.485 "name": "NewBaseBdev", 00:21:54.485 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:54.485 "is_configured": true, 00:21:54.485 "data_offset": 2048, 00:21:54.485 "data_size": 63488 00:21:54.485 }, 00:21:54.485 { 00:21:54.485 "name": "BaseBdev2", 00:21:54.485 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:54.485 "is_configured": true, 00:21:54.485 "data_offset": 2048, 00:21:54.485 "data_size": 63488 00:21:54.485 }, 00:21:54.485 { 00:21:54.485 "name": "BaseBdev3", 00:21:54.485 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:54.485 "is_configured": true, 00:21:54.485 "data_offset": 2048, 00:21:54.485 "data_size": 63488 00:21:54.485 }, 00:21:54.485 { 00:21:54.485 "name": "BaseBdev4", 00:21:54.485 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:54.485 "is_configured": true, 00:21:54.485 "data_offset": 2048, 00:21:54.485 "data_size": 63488 00:21:54.485 } 00:21:54.485 ] 00:21:54.485 }' 00:21:54.485 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.485 18:25:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:55.050 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:55.050 18:25:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:55.050 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:55.050 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:55.050 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:55.050 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:55.050 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:55.050 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:55.308 [2024-07-12 18:25:38.970069] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:55.308 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:55.308 "name": "Existed_Raid", 00:21:55.308 "aliases": [ 00:21:55.308 "d3062b40-e920-4167-8e78-3dd633c2cb92" 00:21:55.308 ], 00:21:55.308 "product_name": "Raid Volume", 00:21:55.308 "block_size": 512, 00:21:55.308 "num_blocks": 63488, 00:21:55.308 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:55.308 "assigned_rate_limits": { 00:21:55.308 "rw_ios_per_sec": 0, 00:21:55.308 "rw_mbytes_per_sec": 0, 00:21:55.308 "r_mbytes_per_sec": 0, 00:21:55.308 "w_mbytes_per_sec": 0 00:21:55.308 }, 00:21:55.308 "claimed": false, 00:21:55.308 "zoned": false, 00:21:55.308 "supported_io_types": { 00:21:55.308 "read": true, 00:21:55.308 "write": true, 00:21:55.308 "unmap": false, 00:21:55.308 "flush": false, 00:21:55.308 "reset": true, 00:21:55.308 "nvme_admin": false, 00:21:55.308 "nvme_io": false, 00:21:55.308 "nvme_io_md": false, 00:21:55.308 "write_zeroes": true, 00:21:55.308 "zcopy": false, 00:21:55.308 
"get_zone_info": false, 00:21:55.308 "zone_management": false, 00:21:55.308 "zone_append": false, 00:21:55.308 "compare": false, 00:21:55.308 "compare_and_write": false, 00:21:55.308 "abort": false, 00:21:55.308 "seek_hole": false, 00:21:55.308 "seek_data": false, 00:21:55.308 "copy": false, 00:21:55.308 "nvme_iov_md": false 00:21:55.308 }, 00:21:55.308 "memory_domains": [ 00:21:55.308 { 00:21:55.308 "dma_device_id": "system", 00:21:55.308 "dma_device_type": 1 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.308 "dma_device_type": 2 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "dma_device_id": "system", 00:21:55.308 "dma_device_type": 1 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.308 "dma_device_type": 2 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "dma_device_id": "system", 00:21:55.308 "dma_device_type": 1 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.308 "dma_device_type": 2 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "dma_device_id": "system", 00:21:55.308 "dma_device_type": 1 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.308 "dma_device_type": 2 00:21:55.308 } 00:21:55.308 ], 00:21:55.308 "driver_specific": { 00:21:55.308 "raid": { 00:21:55.308 "uuid": "d3062b40-e920-4167-8e78-3dd633c2cb92", 00:21:55.308 "strip_size_kb": 0, 00:21:55.308 "state": "online", 00:21:55.308 "raid_level": "raid1", 00:21:55.308 "superblock": true, 00:21:55.308 "num_base_bdevs": 4, 00:21:55.308 "num_base_bdevs_discovered": 4, 00:21:55.308 "num_base_bdevs_operational": 4, 00:21:55.308 "base_bdevs_list": [ 00:21:55.308 { 00:21:55.308 "name": "NewBaseBdev", 00:21:55.308 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:55.308 "is_configured": true, 00:21:55.308 "data_offset": 2048, 00:21:55.308 "data_size": 63488 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "name": "BaseBdev2", 00:21:55.308 
"uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:55.308 "is_configured": true, 00:21:55.308 "data_offset": 2048, 00:21:55.308 "data_size": 63488 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "name": "BaseBdev3", 00:21:55.308 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:55.308 "is_configured": true, 00:21:55.308 "data_offset": 2048, 00:21:55.308 "data_size": 63488 00:21:55.308 }, 00:21:55.308 { 00:21:55.308 "name": "BaseBdev4", 00:21:55.308 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:55.308 "is_configured": true, 00:21:55.308 "data_offset": 2048, 00:21:55.308 "data_size": 63488 00:21:55.308 } 00:21:55.308 ] 00:21:55.308 } 00:21:55.308 } 00:21:55.308 }' 00:21:55.308 18:25:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:55.308 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:55.308 BaseBdev2 00:21:55.308 BaseBdev3 00:21:55.308 BaseBdev4' 00:21:55.308 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:55.308 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:55.308 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:55.567 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:55.567 "name": "NewBaseBdev", 00:21:55.567 "aliases": [ 00:21:55.567 "5714ba19-bd74-4a47-9199-5bb08f7fb39d" 00:21:55.567 ], 00:21:55.567 "product_name": "Malloc disk", 00:21:55.567 "block_size": 512, 00:21:55.567 "num_blocks": 65536, 00:21:55.567 "uuid": "5714ba19-bd74-4a47-9199-5bb08f7fb39d", 00:21:55.567 "assigned_rate_limits": { 00:21:55.567 "rw_ios_per_sec": 0, 00:21:55.567 "rw_mbytes_per_sec": 0, 
00:21:55.567 "r_mbytes_per_sec": 0, 00:21:55.567 "w_mbytes_per_sec": 0 00:21:55.567 }, 00:21:55.567 "claimed": true, 00:21:55.567 "claim_type": "exclusive_write", 00:21:55.567 "zoned": false, 00:21:55.567 "supported_io_types": { 00:21:55.567 "read": true, 00:21:55.567 "write": true, 00:21:55.567 "unmap": true, 00:21:55.567 "flush": true, 00:21:55.567 "reset": true, 00:21:55.567 "nvme_admin": false, 00:21:55.567 "nvme_io": false, 00:21:55.567 "nvme_io_md": false, 00:21:55.567 "write_zeroes": true, 00:21:55.567 "zcopy": true, 00:21:55.567 "get_zone_info": false, 00:21:55.567 "zone_management": false, 00:21:55.567 "zone_append": false, 00:21:55.567 "compare": false, 00:21:55.567 "compare_and_write": false, 00:21:55.567 "abort": true, 00:21:55.567 "seek_hole": false, 00:21:55.567 "seek_data": false, 00:21:55.567 "copy": true, 00:21:55.567 "nvme_iov_md": false 00:21:55.567 }, 00:21:55.567 "memory_domains": [ 00:21:55.567 { 00:21:55.567 "dma_device_id": "system", 00:21:55.567 "dma_device_type": 1 00:21:55.567 }, 00:21:55.567 { 00:21:55.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.567 "dma_device_type": 2 00:21:55.567 } 00:21:55.567 ], 00:21:55.567 "driver_specific": {} 00:21:55.567 }' 00:21:55.567 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.825 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.825 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:55.825 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.825 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.825 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:55.825 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.825 18:25:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.825 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:55.825 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:56.082 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:56.082 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:56.082 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:56.082 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:56.082 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:56.341 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:56.341 "name": "BaseBdev2", 00:21:56.341 "aliases": [ 00:21:56.341 "0332b9ff-8097-4b92-8362-9b22b179abb6" 00:21:56.341 ], 00:21:56.341 "product_name": "Malloc disk", 00:21:56.341 "block_size": 512, 00:21:56.341 "num_blocks": 65536, 00:21:56.341 "uuid": "0332b9ff-8097-4b92-8362-9b22b179abb6", 00:21:56.341 "assigned_rate_limits": { 00:21:56.341 "rw_ios_per_sec": 0, 00:21:56.341 "rw_mbytes_per_sec": 0, 00:21:56.341 "r_mbytes_per_sec": 0, 00:21:56.341 "w_mbytes_per_sec": 0 00:21:56.341 }, 00:21:56.341 "claimed": true, 00:21:56.341 "claim_type": "exclusive_write", 00:21:56.341 "zoned": false, 00:21:56.341 "supported_io_types": { 00:21:56.341 "read": true, 00:21:56.341 "write": true, 00:21:56.341 "unmap": true, 00:21:56.341 "flush": true, 00:21:56.341 "reset": true, 00:21:56.341 "nvme_admin": false, 00:21:56.341 "nvme_io": false, 00:21:56.341 "nvme_io_md": false, 00:21:56.341 "write_zeroes": true, 00:21:56.341 "zcopy": true, 00:21:56.341 
"get_zone_info": false, 00:21:56.341 "zone_management": false, 00:21:56.341 "zone_append": false, 00:21:56.341 "compare": false, 00:21:56.341 "compare_and_write": false, 00:21:56.341 "abort": true, 00:21:56.341 "seek_hole": false, 00:21:56.341 "seek_data": false, 00:21:56.341 "copy": true, 00:21:56.341 "nvme_iov_md": false 00:21:56.341 }, 00:21:56.341 "memory_domains": [ 00:21:56.341 { 00:21:56.341 "dma_device_id": "system", 00:21:56.341 "dma_device_type": 1 00:21:56.341 }, 00:21:56.341 { 00:21:56.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.341 "dma_device_type": 2 00:21:56.341 } 00:21:56.341 ], 00:21:56.341 "driver_specific": {} 00:21:56.341 }' 00:21:56.341 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.341 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.341 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:56.341 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:56.341 18:25:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:56.341 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:56.341 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:56.341 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:56.599 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:56.599 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:56.599 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:56.599 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:56.599 18:25:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:56.599 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:56.599 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:56.858 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:56.858 "name": "BaseBdev3", 00:21:56.858 "aliases": [ 00:21:56.858 "13a54bb6-29b6-4e6a-b83f-d66877399424" 00:21:56.858 ], 00:21:56.858 "product_name": "Malloc disk", 00:21:56.858 "block_size": 512, 00:21:56.858 "num_blocks": 65536, 00:21:56.858 "uuid": "13a54bb6-29b6-4e6a-b83f-d66877399424", 00:21:56.858 "assigned_rate_limits": { 00:21:56.858 "rw_ios_per_sec": 0, 00:21:56.858 "rw_mbytes_per_sec": 0, 00:21:56.858 "r_mbytes_per_sec": 0, 00:21:56.858 "w_mbytes_per_sec": 0 00:21:56.858 }, 00:21:56.858 "claimed": true, 00:21:56.858 "claim_type": "exclusive_write", 00:21:56.858 "zoned": false, 00:21:56.858 "supported_io_types": { 00:21:56.858 "read": true, 00:21:56.858 "write": true, 00:21:56.858 "unmap": true, 00:21:56.858 "flush": true, 00:21:56.858 "reset": true, 00:21:56.858 "nvme_admin": false, 00:21:56.858 "nvme_io": false, 00:21:56.858 "nvme_io_md": false, 00:21:56.858 "write_zeroes": true, 00:21:56.858 "zcopy": true, 00:21:56.858 "get_zone_info": false, 00:21:56.858 "zone_management": false, 00:21:56.858 "zone_append": false, 00:21:56.858 "compare": false, 00:21:56.858 "compare_and_write": false, 00:21:56.858 "abort": true, 00:21:56.858 "seek_hole": false, 00:21:56.858 "seek_data": false, 00:21:56.858 "copy": true, 00:21:56.858 "nvme_iov_md": false 00:21:56.858 }, 00:21:56.858 "memory_domains": [ 00:21:56.858 { 00:21:56.858 "dma_device_id": "system", 00:21:56.858 "dma_device_type": 1 00:21:56.858 }, 00:21:56.858 { 00:21:56.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.858 
"dma_device_type": 2 00:21:56.858 } 00:21:56.858 ], 00:21:56.858 "driver_specific": {} 00:21:56.858 }' 00:21:56.858 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.858 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:56.858 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:56.858 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:57.117 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:57.117 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:57.117 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:57.117 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:57.117 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:57.117 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:57.118 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:57.118 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:57.118 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:57.118 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:57.118 18:25:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:57.377 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:57.377 "name": "BaseBdev4", 00:21:57.377 "aliases": [ 00:21:57.377 
"a533f0de-60fe-40e5-bb92-3ae6a238737c" 00:21:57.377 ], 00:21:57.377 "product_name": "Malloc disk", 00:21:57.377 "block_size": 512, 00:21:57.377 "num_blocks": 65536, 00:21:57.377 "uuid": "a533f0de-60fe-40e5-bb92-3ae6a238737c", 00:21:57.377 "assigned_rate_limits": { 00:21:57.377 "rw_ios_per_sec": 0, 00:21:57.377 "rw_mbytes_per_sec": 0, 00:21:57.377 "r_mbytes_per_sec": 0, 00:21:57.377 "w_mbytes_per_sec": 0 00:21:57.377 }, 00:21:57.377 "claimed": true, 00:21:57.377 "claim_type": "exclusive_write", 00:21:57.377 "zoned": false, 00:21:57.377 "supported_io_types": { 00:21:57.377 "read": true, 00:21:57.377 "write": true, 00:21:57.377 "unmap": true, 00:21:57.377 "flush": true, 00:21:57.377 "reset": true, 00:21:57.377 "nvme_admin": false, 00:21:57.377 "nvme_io": false, 00:21:57.377 "nvme_io_md": false, 00:21:57.377 "write_zeroes": true, 00:21:57.377 "zcopy": true, 00:21:57.377 "get_zone_info": false, 00:21:57.377 "zone_management": false, 00:21:57.377 "zone_append": false, 00:21:57.377 "compare": false, 00:21:57.377 "compare_and_write": false, 00:21:57.377 "abort": true, 00:21:57.377 "seek_hole": false, 00:21:57.377 "seek_data": false, 00:21:57.377 "copy": true, 00:21:57.377 "nvme_iov_md": false 00:21:57.377 }, 00:21:57.377 "memory_domains": [ 00:21:57.377 { 00:21:57.377 "dma_device_id": "system", 00:21:57.377 "dma_device_type": 1 00:21:57.377 }, 00:21:57.377 { 00:21:57.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:57.377 "dma_device_type": 2 00:21:57.377 } 00:21:57.377 ], 00:21:57.377 "driver_specific": {} 00:21:57.377 }' 00:21:57.377 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:57.636 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:57.636 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:57.636 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:57.636 18:25:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:57.636 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:57.636 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:57.636 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:57.636 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:57.636 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:57.895 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:57.895 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:57.895 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:58.153 [2024-07-12 18:25:41.652891] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:58.153 [2024-07-12 18:25:41.652919] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:58.153 [2024-07-12 18:25:41.652972] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:58.153 [2024-07-12 18:25:41.653255] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:58.153 [2024-07-12 18:25:41.653268] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1239180 name Existed_Raid, state offline 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2553674 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2553674 ']' 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 2553674 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2553674 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2553674' 00:21:58.153 killing process with pid 2553674 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2553674 00:21:58.153 [2024-07-12 18:25:41.720848] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:58.153 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2553674 00:21:58.154 [2024-07-12 18:25:41.758259] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:58.413 18:25:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:58.413 00:21:58.413 real 0m34.750s 00:21:58.413 user 1m3.993s 00:21:58.413 sys 0m5.909s 00:21:58.413 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:58.413 18:25:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:58.413 ************************************ 00:21:58.413 END TEST raid_state_function_test_sb 00:21:58.413 ************************************ 00:21:58.413 18:25:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:58.413 18:25:42 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test raid1 4 00:21:58.413 18:25:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:58.413 18:25:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:58.413 18:25:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:58.413 ************************************ 00:21:58.413 START TEST raid_superblock_test 00:21:58.413 ************************************ 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2558879 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2558879 /var/tmp/spdk-raid.sock 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2558879 ']' 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:58.413 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:58.413 18:25:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:58.413 [2024-07-12 18:25:42.101422] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:21:58.413 [2024-07-12 18:25:42.101491] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2558879 ] 00:21:58.672 [2024-07-12 18:25:42.218581] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:58.672 [2024-07-12 18:25:42.316795] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:58.672 [2024-07-12 18:25:42.378356] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:58.672 [2024-07-12 18:25:42.378392] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:59.607 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:59.864 malloc1 00:21:59.864 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:00.429 [2024-07-12 18:25:43.960221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:00.429 [2024-07-12 18:25:43.960271] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.429 [2024-07-12 18:25:43.960291] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x137b570 00:22:00.429 [2024-07-12 18:25:43.960304] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.429 [2024-07-12 18:25:43.962020] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:00.429 [2024-07-12 18:25:43.962047] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:00.429 pt1 00:22:00.429 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:00.429 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:00.429 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:00.429 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:00.429 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:00.429 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:00.429 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:00.429 18:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:00.429 18:25:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:00.688 malloc2 00:22:00.688 18:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:01.255 [2024-07-12 18:25:44.727029] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:01.255 [2024-07-12 18:25:44.727076] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:01.255 [2024-07-12 18:25:44.727093] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x137c970 00:22:01.255 [2024-07-12 18:25:44.727106] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:01.255 [2024-07-12 18:25:44.728746] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:01.255 [2024-07-12 18:25:44.728773] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:01.256 pt2 00:22:01.256 18:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:01.256 18:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:01.256 18:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:22:01.256 18:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:22:01.256 18:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:01.256 18:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:01.256 18:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:01.256 18:25:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:01.256 18:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:01.514 malloc3 00:22:01.514 18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:01.771 [2024-07-12 18:25:45.494810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:01.771 [2024-07-12 18:25:45.494859] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:01.771 [2024-07-12 18:25:45.494877] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1513340 00:22:01.771 [2024-07-12 18:25:45.494889] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:01.771 [2024-07-12 18:25:45.496493] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:01.771 [2024-07-12 18:25:45.496520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:02.030 pt3 00:22:02.030 18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:02.030 18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:02.030 18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:22:02.030 18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:22:02.030 18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:02.030 18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:02.030 
18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:02.030 18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:02.030 18:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:02.597 malloc4 00:22:02.597 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:02.597 [2024-07-12 18:25:46.269416] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:02.597 [2024-07-12 18:25:46.269465] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.597 [2024-07-12 18:25:46.269485] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1515c60 00:22:02.597 [2024-07-12 18:25:46.269497] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.597 [2024-07-12 18:25:46.271057] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.597 [2024-07-12 18:25:46.271086] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:02.597 pt4 00:22:02.597 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:02.597 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:02.597 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:03.163 [2024-07-12 18:25:46.766736] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:22:03.163 [2024-07-12 18:25:46.768079] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:03.163 [2024-07-12 18:25:46.768134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:03.163 [2024-07-12 18:25:46.768178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:03.163 [2024-07-12 18:25:46.768348] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1373530 00:22:03.163 [2024-07-12 18:25:46.768359] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:03.163 [2024-07-12 18:25:46.768556] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1371770 00:22:03.163 [2024-07-12 18:25:46.768707] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1373530 00:22:03.163 [2024-07-12 18:25:46.768718] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1373530 00:22:03.163 [2024-07-12 18:25:46.768817] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.163 18:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.422 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:03.422 "name": "raid_bdev1", 00:22:03.422 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:03.422 "strip_size_kb": 0, 00:22:03.422 "state": "online", 00:22:03.422 "raid_level": "raid1", 00:22:03.422 "superblock": true, 00:22:03.422 "num_base_bdevs": 4, 00:22:03.422 "num_base_bdevs_discovered": 4, 00:22:03.422 "num_base_bdevs_operational": 4, 00:22:03.422 "base_bdevs_list": [ 00:22:03.422 { 00:22:03.422 "name": "pt1", 00:22:03.422 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:03.422 "is_configured": true, 00:22:03.422 "data_offset": 2048, 00:22:03.422 "data_size": 63488 00:22:03.422 }, 00:22:03.422 { 00:22:03.422 "name": "pt2", 00:22:03.422 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:03.422 "is_configured": true, 00:22:03.422 "data_offset": 2048, 00:22:03.422 "data_size": 63488 00:22:03.422 }, 00:22:03.422 { 00:22:03.422 "name": "pt3", 00:22:03.422 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:03.422 "is_configured": true, 00:22:03.422 "data_offset": 2048, 00:22:03.422 "data_size": 63488 00:22:03.422 }, 00:22:03.422 { 00:22:03.422 "name": "pt4", 00:22:03.422 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:03.422 "is_configured": true, 00:22:03.422 "data_offset": 2048, 00:22:03.422 "data_size": 63488 00:22:03.422 } 00:22:03.422 ] 00:22:03.422 }' 00:22:03.422 18:25:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:03.422 18:25:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:03.988 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:03.988 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:03.988 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:03.988 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:03.988 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:03.988 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:03.988 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:03.988 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:04.247 [2024-07-12 18:25:47.777687] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:04.247 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:04.247 "name": "raid_bdev1", 00:22:04.247 "aliases": [ 00:22:04.247 "873e1549-bc2e-4d65-af75-ce4e9ea819a1" 00:22:04.247 ], 00:22:04.247 "product_name": "Raid Volume", 00:22:04.247 "block_size": 512, 00:22:04.247 "num_blocks": 63488, 00:22:04.247 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:04.247 "assigned_rate_limits": { 00:22:04.247 "rw_ios_per_sec": 0, 00:22:04.247 "rw_mbytes_per_sec": 0, 00:22:04.247 "r_mbytes_per_sec": 0, 00:22:04.247 "w_mbytes_per_sec": 0 00:22:04.247 }, 00:22:04.247 "claimed": false, 00:22:04.247 "zoned": false, 00:22:04.247 "supported_io_types": { 00:22:04.247 "read": true, 00:22:04.247 "write": true, 00:22:04.247 
"unmap": false, 00:22:04.247 "flush": false, 00:22:04.247 "reset": true, 00:22:04.247 "nvme_admin": false, 00:22:04.247 "nvme_io": false, 00:22:04.247 "nvme_io_md": false, 00:22:04.247 "write_zeroes": true, 00:22:04.247 "zcopy": false, 00:22:04.247 "get_zone_info": false, 00:22:04.247 "zone_management": false, 00:22:04.247 "zone_append": false, 00:22:04.247 "compare": false, 00:22:04.247 "compare_and_write": false, 00:22:04.247 "abort": false, 00:22:04.247 "seek_hole": false, 00:22:04.247 "seek_data": false, 00:22:04.247 "copy": false, 00:22:04.247 "nvme_iov_md": false 00:22:04.247 }, 00:22:04.247 "memory_domains": [ 00:22:04.247 { 00:22:04.247 "dma_device_id": "system", 00:22:04.247 "dma_device_type": 1 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.247 "dma_device_type": 2 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "dma_device_id": "system", 00:22:04.247 "dma_device_type": 1 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.247 "dma_device_type": 2 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "dma_device_id": "system", 00:22:04.247 "dma_device_type": 1 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.247 "dma_device_type": 2 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "dma_device_id": "system", 00:22:04.247 "dma_device_type": 1 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.247 "dma_device_type": 2 00:22:04.247 } 00:22:04.247 ], 00:22:04.247 "driver_specific": { 00:22:04.247 "raid": { 00:22:04.247 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:04.247 "strip_size_kb": 0, 00:22:04.247 "state": "online", 00:22:04.247 "raid_level": "raid1", 00:22:04.247 "superblock": true, 00:22:04.247 "num_base_bdevs": 4, 00:22:04.247 "num_base_bdevs_discovered": 4, 00:22:04.247 "num_base_bdevs_operational": 4, 00:22:04.247 "base_bdevs_list": [ 00:22:04.247 { 00:22:04.247 "name": "pt1", 
00:22:04.247 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:04.247 "is_configured": true, 00:22:04.247 "data_offset": 2048, 00:22:04.247 "data_size": 63488 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "name": "pt2", 00:22:04.247 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:04.247 "is_configured": true, 00:22:04.247 "data_offset": 2048, 00:22:04.247 "data_size": 63488 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "name": "pt3", 00:22:04.247 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:04.247 "is_configured": true, 00:22:04.247 "data_offset": 2048, 00:22:04.247 "data_size": 63488 00:22:04.247 }, 00:22:04.247 { 00:22:04.247 "name": "pt4", 00:22:04.247 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:04.247 "is_configured": true, 00:22:04.247 "data_offset": 2048, 00:22:04.247 "data_size": 63488 00:22:04.247 } 00:22:04.247 ] 00:22:04.247 } 00:22:04.247 } 00:22:04.247 }' 00:22:04.247 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:04.247 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:04.247 pt2 00:22:04.247 pt3 00:22:04.247 pt4' 00:22:04.247 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:04.247 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:04.247 18:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:04.505 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:04.505 "name": "pt1", 00:22:04.505 "aliases": [ 00:22:04.505 "00000000-0000-0000-0000-000000000001" 00:22:04.505 ], 00:22:04.505 "product_name": "passthru", 00:22:04.505 "block_size": 512, 00:22:04.505 "num_blocks": 65536, 00:22:04.505 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:22:04.505 "assigned_rate_limits": { 00:22:04.505 "rw_ios_per_sec": 0, 00:22:04.505 "rw_mbytes_per_sec": 0, 00:22:04.505 "r_mbytes_per_sec": 0, 00:22:04.505 "w_mbytes_per_sec": 0 00:22:04.505 }, 00:22:04.505 "claimed": true, 00:22:04.505 "claim_type": "exclusive_write", 00:22:04.505 "zoned": false, 00:22:04.505 "supported_io_types": { 00:22:04.505 "read": true, 00:22:04.505 "write": true, 00:22:04.505 "unmap": true, 00:22:04.505 "flush": true, 00:22:04.505 "reset": true, 00:22:04.505 "nvme_admin": false, 00:22:04.505 "nvme_io": false, 00:22:04.505 "nvme_io_md": false, 00:22:04.505 "write_zeroes": true, 00:22:04.505 "zcopy": true, 00:22:04.505 "get_zone_info": false, 00:22:04.505 "zone_management": false, 00:22:04.505 "zone_append": false, 00:22:04.505 "compare": false, 00:22:04.505 "compare_and_write": false, 00:22:04.505 "abort": true, 00:22:04.505 "seek_hole": false, 00:22:04.505 "seek_data": false, 00:22:04.505 "copy": true, 00:22:04.505 "nvme_iov_md": false 00:22:04.505 }, 00:22:04.505 "memory_domains": [ 00:22:04.505 { 00:22:04.505 "dma_device_id": "system", 00:22:04.505 "dma_device_type": 1 00:22:04.505 }, 00:22:04.505 { 00:22:04.505 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.505 "dma_device_type": 2 00:22:04.505 } 00:22:04.505 ], 00:22:04.505 "driver_specific": { 00:22:04.505 "passthru": { 00:22:04.505 "name": "pt1", 00:22:04.505 "base_bdev_name": "malloc1" 00:22:04.505 } 00:22:04.505 } 00:22:04.505 }' 00:22:04.505 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.505 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.505 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:04.505 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.762 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.762 18:25:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:04.762 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.762 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.762 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:04.762 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.762 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.762 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:04.762 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:04.762 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:04.763 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:05.023 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:05.023 "name": "pt2", 00:22:05.023 "aliases": [ 00:22:05.023 "00000000-0000-0000-0000-000000000002" 00:22:05.023 ], 00:22:05.023 "product_name": "passthru", 00:22:05.023 "block_size": 512, 00:22:05.023 "num_blocks": 65536, 00:22:05.023 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:05.023 "assigned_rate_limits": { 00:22:05.023 "rw_ios_per_sec": 0, 00:22:05.023 "rw_mbytes_per_sec": 0, 00:22:05.023 "r_mbytes_per_sec": 0, 00:22:05.023 "w_mbytes_per_sec": 0 00:22:05.023 }, 00:22:05.023 "claimed": true, 00:22:05.023 "claim_type": "exclusive_write", 00:22:05.023 "zoned": false, 00:22:05.023 "supported_io_types": { 00:22:05.023 "read": true, 00:22:05.023 "write": true, 00:22:05.023 "unmap": true, 00:22:05.023 "flush": true, 00:22:05.023 "reset": true, 00:22:05.023 "nvme_admin": false, 00:22:05.023 
"nvme_io": false, 00:22:05.023 "nvme_io_md": false, 00:22:05.023 "write_zeroes": true, 00:22:05.023 "zcopy": true, 00:22:05.023 "get_zone_info": false, 00:22:05.023 "zone_management": false, 00:22:05.023 "zone_append": false, 00:22:05.023 "compare": false, 00:22:05.023 "compare_and_write": false, 00:22:05.023 "abort": true, 00:22:05.023 "seek_hole": false, 00:22:05.023 "seek_data": false, 00:22:05.023 "copy": true, 00:22:05.023 "nvme_iov_md": false 00:22:05.023 }, 00:22:05.023 "memory_domains": [ 00:22:05.023 { 00:22:05.023 "dma_device_id": "system", 00:22:05.023 "dma_device_type": 1 00:22:05.023 }, 00:22:05.023 { 00:22:05.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.023 "dma_device_type": 2 00:22:05.023 } 00:22:05.023 ], 00:22:05.023 "driver_specific": { 00:22:05.023 "passthru": { 00:22:05.023 "name": "pt2", 00:22:05.023 "base_bdev_name": "malloc2" 00:22:05.023 } 00:22:05.023 } 00:22:05.023 }' 00:22:05.023 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.023 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.325 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:05.325 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.325 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.325 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:05.325 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.325 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.325 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:05.325 18:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.325 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:05.325 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:05.616 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:05.616 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:05.616 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:05.616 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:05.616 "name": "pt3", 00:22:05.616 "aliases": [ 00:22:05.616 "00000000-0000-0000-0000-000000000003" 00:22:05.616 ], 00:22:05.616 "product_name": "passthru", 00:22:05.616 "block_size": 512, 00:22:05.616 "num_blocks": 65536, 00:22:05.616 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:05.616 "assigned_rate_limits": { 00:22:05.616 "rw_ios_per_sec": 0, 00:22:05.616 "rw_mbytes_per_sec": 0, 00:22:05.616 "r_mbytes_per_sec": 0, 00:22:05.616 "w_mbytes_per_sec": 0 00:22:05.616 }, 00:22:05.616 "claimed": true, 00:22:05.616 "claim_type": "exclusive_write", 00:22:05.616 "zoned": false, 00:22:05.616 "supported_io_types": { 00:22:05.616 "read": true, 00:22:05.616 "write": true, 00:22:05.616 "unmap": true, 00:22:05.616 "flush": true, 00:22:05.616 "reset": true, 00:22:05.616 "nvme_admin": false, 00:22:05.616 "nvme_io": false, 00:22:05.616 "nvme_io_md": false, 00:22:05.616 "write_zeroes": true, 00:22:05.616 "zcopy": true, 00:22:05.616 "get_zone_info": false, 00:22:05.616 "zone_management": false, 00:22:05.616 "zone_append": false, 00:22:05.616 "compare": false, 00:22:05.616 "compare_and_write": false, 00:22:05.616 "abort": true, 00:22:05.616 "seek_hole": false, 00:22:05.616 "seek_data": false, 00:22:05.616 "copy": true, 00:22:05.616 "nvme_iov_md": false 00:22:05.616 }, 00:22:05.616 "memory_domains": [ 00:22:05.616 { 00:22:05.616 "dma_device_id": "system", 00:22:05.616 
"dma_device_type": 1 00:22:05.616 }, 00:22:05.616 { 00:22:05.616 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.616 "dma_device_type": 2 00:22:05.616 } 00:22:05.616 ], 00:22:05.616 "driver_specific": { 00:22:05.616 "passthru": { 00:22:05.616 "name": "pt3", 00:22:05.616 "base_bdev_name": "malloc3" 00:22:05.616 } 00:22:05.616 } 00:22:05.616 }' 00:22:05.616 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.616 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.616 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:05.616 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:05.874 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:06.131 18:25:49 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:06.131 "name": "pt4", 00:22:06.131 "aliases": [ 00:22:06.131 "00000000-0000-0000-0000-000000000004" 00:22:06.131 ], 00:22:06.131 "product_name": "passthru", 00:22:06.131 "block_size": 512, 00:22:06.131 "num_blocks": 65536, 00:22:06.131 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:06.131 "assigned_rate_limits": { 00:22:06.131 "rw_ios_per_sec": 0, 00:22:06.131 "rw_mbytes_per_sec": 0, 00:22:06.131 "r_mbytes_per_sec": 0, 00:22:06.131 "w_mbytes_per_sec": 0 00:22:06.131 }, 00:22:06.131 "claimed": true, 00:22:06.131 "claim_type": "exclusive_write", 00:22:06.131 "zoned": false, 00:22:06.131 "supported_io_types": { 00:22:06.131 "read": true, 00:22:06.131 "write": true, 00:22:06.131 "unmap": true, 00:22:06.131 "flush": true, 00:22:06.131 "reset": true, 00:22:06.131 "nvme_admin": false, 00:22:06.131 "nvme_io": false, 00:22:06.131 "nvme_io_md": false, 00:22:06.131 "write_zeroes": true, 00:22:06.131 "zcopy": true, 00:22:06.131 "get_zone_info": false, 00:22:06.131 "zone_management": false, 00:22:06.131 "zone_append": false, 00:22:06.131 "compare": false, 00:22:06.131 "compare_and_write": false, 00:22:06.131 "abort": true, 00:22:06.131 "seek_hole": false, 00:22:06.131 "seek_data": false, 00:22:06.131 "copy": true, 00:22:06.131 "nvme_iov_md": false 00:22:06.131 }, 00:22:06.131 "memory_domains": [ 00:22:06.131 { 00:22:06.131 "dma_device_id": "system", 00:22:06.131 "dma_device_type": 1 00:22:06.131 }, 00:22:06.131 { 00:22:06.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.131 "dma_device_type": 2 00:22:06.131 } 00:22:06.131 ], 00:22:06.131 "driver_specific": { 00:22:06.131 "passthru": { 00:22:06.131 "name": "pt4", 00:22:06.131 "base_bdev_name": "malloc4" 00:22:06.131 } 00:22:06.131 } 00:22:06.131 }' 00:22:06.131 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.131 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.388 18:25:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:06.388 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.388 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.389 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:06.389 18:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:06.389 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:06.389 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:06.389 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:06.389 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:06.647 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:06.647 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:06.647 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:06.647 [2024-07-12 18:25:50.352532] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:06.647 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=873e1549-bc2e-4d65-af75-ce4e9ea819a1 00:22:06.647 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 873e1549-bc2e-4d65-af75-ce4e9ea819a1 ']' 00:22:06.647 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:06.905 [2024-07-12 18:25:50.596849] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:06.905 
[2024-07-12 18:25:50.596870] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:06.905 [2024-07-12 18:25:50.596914] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:06.905 [2024-07-12 18:25:50.597003] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:06.905 [2024-07-12 18:25:50.597016] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1373530 name raid_bdev1, state offline 00:22:06.905 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.905 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:07.163 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:07.163 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:07.163 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:07.163 18:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:07.421 18:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:07.421 18:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:07.679 18:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:07.679 18:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:07.936 18:25:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:07.936 18:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:08.193 18:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:08.193 18:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:08.451 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:08.709 [2024-07-12 18:25:52.301306] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:08.709 [2024-07-12 18:25:52.302641] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:08.709 [2024-07-12 18:25:52.302683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:22:08.709 [2024-07-12 18:25:52.302716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:22:08.709 [2024-07-12 18:25:52.302759] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:08.709 [2024-07-12 18:25:52.302799] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:08.709 [2024-07-12 18:25:52.302828] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:22:08.709 [2024-07-12 18:25:52.302850] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:22:08.709 [2024-07-12 
18:25:52.302867] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:08.709 [2024-07-12 18:25:52.302877] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151eff0 name raid_bdev1, state configuring 00:22:08.709 request: 00:22:08.709 { 00:22:08.709 "name": "raid_bdev1", 00:22:08.709 "raid_level": "raid1", 00:22:08.709 "base_bdevs": [ 00:22:08.709 "malloc1", 00:22:08.709 "malloc2", 00:22:08.709 "malloc3", 00:22:08.709 "malloc4" 00:22:08.709 ], 00:22:08.709 "superblock": false, 00:22:08.709 "method": "bdev_raid_create", 00:22:08.709 "req_id": 1 00:22:08.709 } 00:22:08.709 Got JSON-RPC error response 00:22:08.709 response: 00:22:08.709 { 00:22:08.709 "code": -17, 00:22:08.709 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:08.709 } 00:22:08.709 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:22:08.709 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:08.709 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:08.709 18:25:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:08.709 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.709 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:08.967 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:08.967 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:08.967 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:09.226 [2024-07-12 18:25:52.782514] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:09.226 [2024-07-12 18:25:52.782552] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.226 [2024-07-12 18:25:52.782571] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x137b7a0 00:22:09.226 [2024-07-12 18:25:52.782584] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.226 [2024-07-12 18:25:52.784171] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.226 [2024-07-12 18:25:52.784212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:09.226 [2024-07-12 18:25:52.784276] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:09.226 [2024-07-12 18:25:52.784301] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:09.226 pt1 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.226 18:25:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.226 18:25:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.483 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.483 "name": "raid_bdev1", 00:22:09.483 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:09.483 "strip_size_kb": 0, 00:22:09.483 "state": "configuring", 00:22:09.483 "raid_level": "raid1", 00:22:09.483 "superblock": true, 00:22:09.483 "num_base_bdevs": 4, 00:22:09.483 "num_base_bdevs_discovered": 1, 00:22:09.483 "num_base_bdevs_operational": 4, 00:22:09.483 "base_bdevs_list": [ 00:22:09.483 { 00:22:09.483 "name": "pt1", 00:22:09.483 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:09.483 "is_configured": true, 00:22:09.483 "data_offset": 2048, 00:22:09.483 "data_size": 63488 00:22:09.483 }, 00:22:09.483 { 00:22:09.483 "name": null, 00:22:09.483 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:09.483 "is_configured": false, 00:22:09.483 "data_offset": 2048, 00:22:09.483 "data_size": 63488 00:22:09.483 }, 00:22:09.483 { 00:22:09.483 "name": null, 00:22:09.483 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:09.483 "is_configured": false, 00:22:09.483 "data_offset": 2048, 00:22:09.483 "data_size": 63488 00:22:09.483 }, 00:22:09.483 { 00:22:09.483 "name": null, 00:22:09.483 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:09.483 "is_configured": false, 00:22:09.483 "data_offset": 2048, 00:22:09.483 "data_size": 63488 00:22:09.483 } 00:22:09.483 ] 00:22:09.483 }' 00:22:09.483 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:09.483 18:25:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:22:10.048 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:22:10.048 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:10.048 [2024-07-12 18:25:53.773160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:10.048 [2024-07-12 18:25:53.773205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:10.048 [2024-07-12 18:25:53.773224] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1514940 00:22:10.048 [2024-07-12 18:25:53.773237] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:10.048 [2024-07-12 18:25:53.773557] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:10.048 [2024-07-12 18:25:53.773574] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:10.048 [2024-07-12 18:25:53.773630] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:10.048 [2024-07-12 18:25:53.773648] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:10.305 pt2 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:10.305 [2024-07-12 18:25:53.953640] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.305 18:25:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.563 18:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.563 "name": "raid_bdev1", 00:22:10.563 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:10.563 "strip_size_kb": 0, 00:22:10.563 "state": "configuring", 00:22:10.563 "raid_level": "raid1", 00:22:10.563 "superblock": true, 00:22:10.563 "num_base_bdevs": 4, 00:22:10.563 "num_base_bdevs_discovered": 1, 00:22:10.563 "num_base_bdevs_operational": 4, 00:22:10.563 "base_bdevs_list": [ 00:22:10.563 { 00:22:10.563 "name": "pt1", 00:22:10.563 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:10.563 "is_configured": true, 00:22:10.563 "data_offset": 2048, 00:22:10.563 "data_size": 63488 00:22:10.563 }, 00:22:10.563 { 00:22:10.563 "name": null, 00:22:10.563 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:10.563 "is_configured": false, 00:22:10.563 "data_offset": 2048, 00:22:10.563 
"data_size": 63488 00:22:10.563 }, 00:22:10.563 { 00:22:10.563 "name": null, 00:22:10.563 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:10.563 "is_configured": false, 00:22:10.563 "data_offset": 2048, 00:22:10.563 "data_size": 63488 00:22:10.563 }, 00:22:10.563 { 00:22:10.563 "name": null, 00:22:10.563 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:10.563 "is_configured": false, 00:22:10.563 "data_offset": 2048, 00:22:10.563 "data_size": 63488 00:22:10.563 } 00:22:10.563 ] 00:22:10.563 }' 00:22:10.563 18:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.563 18:25:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:11.129 18:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:11.129 18:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:11.129 18:25:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:11.387 [2024-07-12 18:25:55.044588] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:11.387 [2024-07-12 18:25:55.044636] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.387 [2024-07-12 18:25:55.044654] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1372060 00:22:11.387 [2024-07-12 18:25:55.044667] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.387 [2024-07-12 18:25:55.045007] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.387 [2024-07-12 18:25:55.045025] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:11.387 [2024-07-12 18:25:55.045084] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:22:11.387 [2024-07-12 18:25:55.045102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:11.387 pt2 00:22:11.387 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:11.387 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:11.387 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:11.645 [2024-07-12 18:25:55.285241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:11.645 [2024-07-12 18:25:55.285282] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.645 [2024-07-12 18:25:55.285302] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13748d0 00:22:11.645 [2024-07-12 18:25:55.285315] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.645 [2024-07-12 18:25:55.285627] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.645 [2024-07-12 18:25:55.285643] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:11.645 [2024-07-12 18:25:55.285698] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:11.645 [2024-07-12 18:25:55.285715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:11.645 pt3 00:22:11.645 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:11.645 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:11.645 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:22:11.904 [2024-07-12 18:25:55.517860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:11.904 [2024-07-12 18:25:55.517894] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:11.904 [2024-07-12 18:25:55.517909] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1375b80 00:22:11.904 [2024-07-12 18:25:55.517922] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:11.904 [2024-07-12 18:25:55.518222] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:11.904 [2024-07-12 18:25:55.518239] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:11.904 [2024-07-12 18:25:55.518290] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:11.904 [2024-07-12 18:25:55.518309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:11.904 [2024-07-12 18:25:55.518428] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1372780 00:22:11.904 [2024-07-12 18:25:55.518439] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:11.904 [2024-07-12 18:25:55.518605] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1377fa0 00:22:11.904 [2024-07-12 18:25:55.518736] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1372780 00:22:11.904 [2024-07-12 18:25:55.518746] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1372780 00:22:11.904 [2024-07-12 18:25:55.518839] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:11.904 pt4 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:11.904 18:25:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.904 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.161 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.161 "name": "raid_bdev1", 00:22:12.161 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:12.161 "strip_size_kb": 0, 00:22:12.161 "state": "online", 00:22:12.161 "raid_level": "raid1", 00:22:12.161 "superblock": true, 00:22:12.161 "num_base_bdevs": 4, 00:22:12.161 "num_base_bdevs_discovered": 4, 00:22:12.161 "num_base_bdevs_operational": 4, 00:22:12.161 "base_bdevs_list": [ 00:22:12.161 { 00:22:12.161 "name": "pt1", 00:22:12.161 "uuid": "00000000-0000-0000-0000-000000000001", 
00:22:12.161 "is_configured": true, 00:22:12.161 "data_offset": 2048, 00:22:12.161 "data_size": 63488 00:22:12.161 }, 00:22:12.161 { 00:22:12.161 "name": "pt2", 00:22:12.161 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:12.161 "is_configured": true, 00:22:12.161 "data_offset": 2048, 00:22:12.161 "data_size": 63488 00:22:12.161 }, 00:22:12.161 { 00:22:12.161 "name": "pt3", 00:22:12.161 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:12.161 "is_configured": true, 00:22:12.161 "data_offset": 2048, 00:22:12.161 "data_size": 63488 00:22:12.161 }, 00:22:12.161 { 00:22:12.161 "name": "pt4", 00:22:12.161 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:12.161 "is_configured": true, 00:22:12.161 "data_offset": 2048, 00:22:12.161 "data_size": 63488 00:22:12.161 } 00:22:12.161 ] 00:22:12.161 }' 00:22:12.161 18:25:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.162 18:25:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:12.727 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:12.727 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:12.727 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:12.727 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:12.727 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:12.727 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:12.727 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:12.727 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:12.985 [2024-07-12 18:25:56.601048] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:12.985 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:12.985 "name": "raid_bdev1", 00:22:12.985 "aliases": [ 00:22:12.985 "873e1549-bc2e-4d65-af75-ce4e9ea819a1" 00:22:12.985 ], 00:22:12.985 "product_name": "Raid Volume", 00:22:12.985 "block_size": 512, 00:22:12.985 "num_blocks": 63488, 00:22:12.985 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:12.985 "assigned_rate_limits": { 00:22:12.985 "rw_ios_per_sec": 0, 00:22:12.985 "rw_mbytes_per_sec": 0, 00:22:12.985 "r_mbytes_per_sec": 0, 00:22:12.985 "w_mbytes_per_sec": 0 00:22:12.985 }, 00:22:12.985 "claimed": false, 00:22:12.985 "zoned": false, 00:22:12.985 "supported_io_types": { 00:22:12.985 "read": true, 00:22:12.985 "write": true, 00:22:12.985 "unmap": false, 00:22:12.985 "flush": false, 00:22:12.985 "reset": true, 00:22:12.985 "nvme_admin": false, 00:22:12.985 "nvme_io": false, 00:22:12.985 "nvme_io_md": false, 00:22:12.985 "write_zeroes": true, 00:22:12.985 "zcopy": false, 00:22:12.985 "get_zone_info": false, 00:22:12.985 "zone_management": false, 00:22:12.985 "zone_append": false, 00:22:12.985 "compare": false, 00:22:12.985 "compare_and_write": false, 00:22:12.985 "abort": false, 00:22:12.985 "seek_hole": false, 00:22:12.985 "seek_data": false, 00:22:12.985 "copy": false, 00:22:12.985 "nvme_iov_md": false 00:22:12.985 }, 00:22:12.985 "memory_domains": [ 00:22:12.985 { 00:22:12.985 "dma_device_id": "system", 00:22:12.985 "dma_device_type": 1 00:22:12.985 }, 00:22:12.985 { 00:22:12.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.985 "dma_device_type": 2 00:22:12.985 }, 00:22:12.985 { 00:22:12.985 "dma_device_id": "system", 00:22:12.986 "dma_device_type": 1 00:22:12.986 }, 00:22:12.986 { 00:22:12.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.986 "dma_device_type": 2 00:22:12.986 }, 00:22:12.986 { 00:22:12.986 "dma_device_id": "system", 00:22:12.986 
"dma_device_type": 1 00:22:12.986 }, 00:22:12.986 { 00:22:12.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.986 "dma_device_type": 2 00:22:12.986 }, 00:22:12.986 { 00:22:12.986 "dma_device_id": "system", 00:22:12.986 "dma_device_type": 1 00:22:12.986 }, 00:22:12.986 { 00:22:12.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.986 "dma_device_type": 2 00:22:12.986 } 00:22:12.986 ], 00:22:12.986 "driver_specific": { 00:22:12.986 "raid": { 00:22:12.986 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:12.986 "strip_size_kb": 0, 00:22:12.986 "state": "online", 00:22:12.986 "raid_level": "raid1", 00:22:12.986 "superblock": true, 00:22:12.986 "num_base_bdevs": 4, 00:22:12.986 "num_base_bdevs_discovered": 4, 00:22:12.986 "num_base_bdevs_operational": 4, 00:22:12.986 "base_bdevs_list": [ 00:22:12.986 { 00:22:12.986 "name": "pt1", 00:22:12.986 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:12.986 "is_configured": true, 00:22:12.986 "data_offset": 2048, 00:22:12.986 "data_size": 63488 00:22:12.986 }, 00:22:12.986 { 00:22:12.986 "name": "pt2", 00:22:12.986 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:12.986 "is_configured": true, 00:22:12.986 "data_offset": 2048, 00:22:12.986 "data_size": 63488 00:22:12.986 }, 00:22:12.986 { 00:22:12.986 "name": "pt3", 00:22:12.986 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:12.986 "is_configured": true, 00:22:12.986 "data_offset": 2048, 00:22:12.986 "data_size": 63488 00:22:12.986 }, 00:22:12.986 { 00:22:12.986 "name": "pt4", 00:22:12.986 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:12.986 "is_configured": true, 00:22:12.986 "data_offset": 2048, 00:22:12.986 "data_size": 63488 00:22:12.986 } 00:22:12.986 ] 00:22:12.986 } 00:22:12.986 } 00:22:12.986 }' 00:22:12.986 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:12.986 18:25:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:12.986 pt2 00:22:12.986 pt3 00:22:12.986 pt4' 00:22:12.986 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:12.986 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:12.986 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:13.245 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:13.245 "name": "pt1", 00:22:13.245 "aliases": [ 00:22:13.245 "00000000-0000-0000-0000-000000000001" 00:22:13.245 ], 00:22:13.245 "product_name": "passthru", 00:22:13.245 "block_size": 512, 00:22:13.245 "num_blocks": 65536, 00:22:13.245 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:13.245 "assigned_rate_limits": { 00:22:13.245 "rw_ios_per_sec": 0, 00:22:13.245 "rw_mbytes_per_sec": 0, 00:22:13.245 "r_mbytes_per_sec": 0, 00:22:13.245 "w_mbytes_per_sec": 0 00:22:13.245 }, 00:22:13.245 "claimed": true, 00:22:13.245 "claim_type": "exclusive_write", 00:22:13.245 "zoned": false, 00:22:13.245 "supported_io_types": { 00:22:13.245 "read": true, 00:22:13.245 "write": true, 00:22:13.245 "unmap": true, 00:22:13.245 "flush": true, 00:22:13.245 "reset": true, 00:22:13.245 "nvme_admin": false, 00:22:13.245 "nvme_io": false, 00:22:13.245 "nvme_io_md": false, 00:22:13.245 "write_zeroes": true, 00:22:13.245 "zcopy": true, 00:22:13.245 "get_zone_info": false, 00:22:13.245 "zone_management": false, 00:22:13.245 "zone_append": false, 00:22:13.245 "compare": false, 00:22:13.245 "compare_and_write": false, 00:22:13.245 "abort": true, 00:22:13.245 "seek_hole": false, 00:22:13.245 "seek_data": false, 00:22:13.245 "copy": true, 00:22:13.245 "nvme_iov_md": false 00:22:13.245 }, 00:22:13.245 "memory_domains": [ 00:22:13.245 { 00:22:13.245 "dma_device_id": "system", 00:22:13.245 
"dma_device_type": 1 00:22:13.245 }, 00:22:13.245 { 00:22:13.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:13.245 "dma_device_type": 2 00:22:13.245 } 00:22:13.245 ], 00:22:13.245 "driver_specific": { 00:22:13.245 "passthru": { 00:22:13.245 "name": "pt1", 00:22:13.245 "base_bdev_name": "malloc1" 00:22:13.245 } 00:22:13.245 } 00:22:13.245 }' 00:22:13.245 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:13.245 18:25:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:13.524 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:13.524 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:13.524 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:13.524 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:13.524 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.524 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.524 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:13.524 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.524 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.781 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:13.781 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:13.781 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:13.781 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:14.039 18:25:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:14.039 "name": "pt2", 00:22:14.039 "aliases": [ 00:22:14.039 "00000000-0000-0000-0000-000000000002" 00:22:14.039 ], 00:22:14.039 "product_name": "passthru", 00:22:14.039 "block_size": 512, 00:22:14.039 "num_blocks": 65536, 00:22:14.039 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:14.039 "assigned_rate_limits": { 00:22:14.039 "rw_ios_per_sec": 0, 00:22:14.039 "rw_mbytes_per_sec": 0, 00:22:14.039 "r_mbytes_per_sec": 0, 00:22:14.039 "w_mbytes_per_sec": 0 00:22:14.039 }, 00:22:14.039 "claimed": true, 00:22:14.039 "claim_type": "exclusive_write", 00:22:14.039 "zoned": false, 00:22:14.039 "supported_io_types": { 00:22:14.039 "read": true, 00:22:14.039 "write": true, 00:22:14.039 "unmap": true, 00:22:14.039 "flush": true, 00:22:14.039 "reset": true, 00:22:14.039 "nvme_admin": false, 00:22:14.039 "nvme_io": false, 00:22:14.039 "nvme_io_md": false, 00:22:14.039 "write_zeroes": true, 00:22:14.039 "zcopy": true, 00:22:14.039 "get_zone_info": false, 00:22:14.039 "zone_management": false, 00:22:14.039 "zone_append": false, 00:22:14.039 "compare": false, 00:22:14.039 "compare_and_write": false, 00:22:14.039 "abort": true, 00:22:14.039 "seek_hole": false, 00:22:14.039 "seek_data": false, 00:22:14.039 "copy": true, 00:22:14.039 "nvme_iov_md": false 00:22:14.039 }, 00:22:14.039 "memory_domains": [ 00:22:14.039 { 00:22:14.039 "dma_device_id": "system", 00:22:14.039 "dma_device_type": 1 00:22:14.039 }, 00:22:14.039 { 00:22:14.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.039 "dma_device_type": 2 00:22:14.039 } 00:22:14.039 ], 00:22:14.039 "driver_specific": { 00:22:14.039 "passthru": { 00:22:14.039 "name": "pt2", 00:22:14.039 "base_bdev_name": "malloc2" 00:22:14.039 } 00:22:14.039 } 00:22:14.039 }' 00:22:14.039 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:14.039 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:14.039 18:25:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:14.039 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:14.039 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:14.039 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:14.039 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:14.039 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:14.297 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:14.297 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:14.297 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:14.297 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:14.297 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:14.297 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:14.297 18:25:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:14.554 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:14.554 "name": "pt3", 00:22:14.554 "aliases": [ 00:22:14.554 "00000000-0000-0000-0000-000000000003" 00:22:14.554 ], 00:22:14.554 "product_name": "passthru", 00:22:14.554 "block_size": 512, 00:22:14.554 "num_blocks": 65536, 00:22:14.554 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:14.554 "assigned_rate_limits": { 00:22:14.554 "rw_ios_per_sec": 0, 00:22:14.554 "rw_mbytes_per_sec": 0, 00:22:14.554 "r_mbytes_per_sec": 0, 00:22:14.554 "w_mbytes_per_sec": 0 00:22:14.554 }, 00:22:14.554 "claimed": true, 00:22:14.554 
"claim_type": "exclusive_write", 00:22:14.554 "zoned": false, 00:22:14.554 "supported_io_types": { 00:22:14.554 "read": true, 00:22:14.554 "write": true, 00:22:14.554 "unmap": true, 00:22:14.554 "flush": true, 00:22:14.554 "reset": true, 00:22:14.554 "nvme_admin": false, 00:22:14.554 "nvme_io": false, 00:22:14.554 "nvme_io_md": false, 00:22:14.554 "write_zeroes": true, 00:22:14.554 "zcopy": true, 00:22:14.554 "get_zone_info": false, 00:22:14.554 "zone_management": false, 00:22:14.554 "zone_append": false, 00:22:14.554 "compare": false, 00:22:14.554 "compare_and_write": false, 00:22:14.554 "abort": true, 00:22:14.554 "seek_hole": false, 00:22:14.554 "seek_data": false, 00:22:14.554 "copy": true, 00:22:14.554 "nvme_iov_md": false 00:22:14.554 }, 00:22:14.554 "memory_domains": [ 00:22:14.554 { 00:22:14.554 "dma_device_id": "system", 00:22:14.554 "dma_device_type": 1 00:22:14.554 }, 00:22:14.554 { 00:22:14.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.554 "dma_device_type": 2 00:22:14.554 } 00:22:14.554 ], 00:22:14.554 "driver_specific": { 00:22:14.554 "passthru": { 00:22:14.554 "name": "pt3", 00:22:14.554 "base_bdev_name": "malloc3" 00:22:14.554 } 00:22:14.554 } 00:22:14.554 }' 00:22:14.554 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:14.554 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:14.554 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:14.554 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:14.554 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:14.811 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:15.069 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:15.069 "name": "pt4", 00:22:15.069 "aliases": [ 00:22:15.069 "00000000-0000-0000-0000-000000000004" 00:22:15.069 ], 00:22:15.069 "product_name": "passthru", 00:22:15.069 "block_size": 512, 00:22:15.069 "num_blocks": 65536, 00:22:15.069 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:15.069 "assigned_rate_limits": { 00:22:15.069 "rw_ios_per_sec": 0, 00:22:15.069 "rw_mbytes_per_sec": 0, 00:22:15.069 "r_mbytes_per_sec": 0, 00:22:15.069 "w_mbytes_per_sec": 0 00:22:15.069 }, 00:22:15.069 "claimed": true, 00:22:15.069 "claim_type": "exclusive_write", 00:22:15.069 "zoned": false, 00:22:15.069 "supported_io_types": { 00:22:15.069 "read": true, 00:22:15.069 "write": true, 00:22:15.069 "unmap": true, 00:22:15.069 "flush": true, 00:22:15.069 "reset": true, 00:22:15.069 "nvme_admin": false, 00:22:15.069 "nvme_io": false, 00:22:15.069 "nvme_io_md": false, 00:22:15.069 "write_zeroes": true, 00:22:15.069 "zcopy": true, 00:22:15.069 "get_zone_info": false, 00:22:15.069 "zone_management": false, 00:22:15.069 "zone_append": false, 00:22:15.069 "compare": false, 00:22:15.069 
"compare_and_write": false, 00:22:15.069 "abort": true, 00:22:15.069 "seek_hole": false, 00:22:15.069 "seek_data": false, 00:22:15.069 "copy": true, 00:22:15.069 "nvme_iov_md": false 00:22:15.069 }, 00:22:15.069 "memory_domains": [ 00:22:15.069 { 00:22:15.069 "dma_device_id": "system", 00:22:15.069 "dma_device_type": 1 00:22:15.069 }, 00:22:15.069 { 00:22:15.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.069 "dma_device_type": 2 00:22:15.069 } 00:22:15.069 ], 00:22:15.069 "driver_specific": { 00:22:15.069 "passthru": { 00:22:15.069 "name": "pt4", 00:22:15.069 "base_bdev_name": "malloc4" 00:22:15.069 } 00:22:15.069 } 00:22:15.069 }' 00:22:15.069 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.328 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.328 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:15.328 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.328 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.328 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:15.328 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.328 18:25:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.328 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:15.328 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.328 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.586 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:15.586 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:15.586 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:15.844 [2024-07-12 18:25:59.324295] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:15.844 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 873e1549-bc2e-4d65-af75-ce4e9ea819a1 '!=' 873e1549-bc2e-4d65-af75-ce4e9ea819a1 ']' 00:22:15.844 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:15.844 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:15.844 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:15.844 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:16.102 [2024-07-12 18:25:59.572678] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.102 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:16.361 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:16.361 "name": "raid_bdev1", 00:22:16.361 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:16.361 "strip_size_kb": 0, 00:22:16.361 "state": "online", 00:22:16.361 "raid_level": "raid1", 00:22:16.361 "superblock": true, 00:22:16.361 "num_base_bdevs": 4, 00:22:16.361 "num_base_bdevs_discovered": 3, 00:22:16.361 "num_base_bdevs_operational": 3, 00:22:16.361 "base_bdevs_list": [ 00:22:16.361 { 00:22:16.361 "name": null, 00:22:16.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:16.361 "is_configured": false, 00:22:16.361 "data_offset": 2048, 00:22:16.361 "data_size": 63488 00:22:16.361 }, 00:22:16.361 { 00:22:16.361 "name": "pt2", 00:22:16.361 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:16.361 "is_configured": true, 00:22:16.361 "data_offset": 2048, 00:22:16.361 "data_size": 63488 00:22:16.361 }, 00:22:16.361 { 00:22:16.361 "name": "pt3", 00:22:16.361 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:16.361 "is_configured": true, 00:22:16.361 "data_offset": 2048, 00:22:16.361 "data_size": 63488 00:22:16.361 }, 00:22:16.361 { 00:22:16.361 "name": "pt4", 00:22:16.361 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:16.361 "is_configured": true, 00:22:16.361 "data_offset": 2048, 00:22:16.361 "data_size": 63488 00:22:16.361 } 00:22:16.361 ] 00:22:16.361 }' 00:22:16.361 18:25:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:16.361 
18:25:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:16.926 18:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:16.926 [2024-07-12 18:26:00.639500] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:16.926 [2024-07-12 18:26:00.639531] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:16.926 [2024-07-12 18:26:00.639582] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:16.926 [2024-07-12 18:26:00.639644] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:16.926 [2024-07-12 18:26:00.639656] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1372780 name raid_bdev1, state offline 00:22:17.183 18:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.183 18:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:17.183 18:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:17.183 18:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:17.183 18:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:17.183 18:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:17.183 18:26:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:17.441 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:17.441 18:26:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:17.441 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:17.698 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:17.698 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:17.698 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:17.956 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:17.956 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:17.956 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:17.956 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:17.956 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:18.215 [2024-07-12 18:26:01.838597] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:18.215 [2024-07-12 18:26:01.838641] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:18.215 [2024-07-12 18:26:01.838660] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1515700 00:22:18.215 [2024-07-12 18:26:01.838672] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:18.215 [2024-07-12 18:26:01.840326] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:18.215 [2024-07-12 18:26:01.840355] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:22:18.215 [2024-07-12 18:26:01.840419] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:18.215 [2024-07-12 18:26:01.840445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:18.215 pt2 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.215 18:26:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.473 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:18.473 "name": "raid_bdev1", 00:22:18.473 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:18.473 "strip_size_kb": 0, 00:22:18.473 "state": "configuring", 
00:22:18.473 "raid_level": "raid1", 00:22:18.473 "superblock": true, 00:22:18.473 "num_base_bdevs": 4, 00:22:18.473 "num_base_bdevs_discovered": 1, 00:22:18.473 "num_base_bdevs_operational": 3, 00:22:18.473 "base_bdevs_list": [ 00:22:18.473 { 00:22:18.473 "name": null, 00:22:18.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.473 "is_configured": false, 00:22:18.473 "data_offset": 2048, 00:22:18.473 "data_size": 63488 00:22:18.473 }, 00:22:18.473 { 00:22:18.473 "name": "pt2", 00:22:18.473 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:18.473 "is_configured": true, 00:22:18.473 "data_offset": 2048, 00:22:18.473 "data_size": 63488 00:22:18.473 }, 00:22:18.473 { 00:22:18.473 "name": null, 00:22:18.473 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:18.473 "is_configured": false, 00:22:18.473 "data_offset": 2048, 00:22:18.473 "data_size": 63488 00:22:18.473 }, 00:22:18.473 { 00:22:18.473 "name": null, 00:22:18.473 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:18.473 "is_configured": false, 00:22:18.473 "data_offset": 2048, 00:22:18.473 "data_size": 63488 00:22:18.473 } 00:22:18.473 ] 00:22:18.473 }' 00:22:18.473 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:18.473 18:26:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:19.038 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:19.038 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:19.038 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:19.308 [2024-07-12 18:26:02.889377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:19.308 [2024-07-12 18:26:02.889422] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.308 [2024-07-12 18:26:02.889442] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x137ba10 00:22:19.308 [2024-07-12 18:26:02.889455] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.308 [2024-07-12 18:26:02.889786] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.308 [2024-07-12 18:26:02.889804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:19.308 [2024-07-12 18:26:02.889863] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:19.308 [2024-07-12 18:26:02.889882] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:19.308 pt3 00:22:19.308 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:19.308 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:19.308 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:19.308 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.308 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.308 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:19.308 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.309 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.309 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.309 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.309 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.309 18:26:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:19.571 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.571 "name": "raid_bdev1", 00:22:19.571 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:19.571 "strip_size_kb": 0, 00:22:19.571 "state": "configuring", 00:22:19.571 "raid_level": "raid1", 00:22:19.571 "superblock": true, 00:22:19.571 "num_base_bdevs": 4, 00:22:19.571 "num_base_bdevs_discovered": 2, 00:22:19.571 "num_base_bdevs_operational": 3, 00:22:19.571 "base_bdevs_list": [ 00:22:19.571 { 00:22:19.571 "name": null, 00:22:19.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.571 "is_configured": false, 00:22:19.571 "data_offset": 2048, 00:22:19.571 "data_size": 63488 00:22:19.571 }, 00:22:19.571 { 00:22:19.571 "name": "pt2", 00:22:19.571 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:19.571 "is_configured": true, 00:22:19.571 "data_offset": 2048, 00:22:19.571 "data_size": 63488 00:22:19.571 }, 00:22:19.571 { 00:22:19.571 "name": "pt3", 00:22:19.571 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:19.571 "is_configured": true, 00:22:19.571 "data_offset": 2048, 00:22:19.571 "data_size": 63488 00:22:19.571 }, 00:22:19.571 { 00:22:19.571 "name": null, 00:22:19.571 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:19.571 "is_configured": false, 00:22:19.571 "data_offset": 2048, 00:22:19.571 "data_size": 63488 00:22:19.571 } 00:22:19.571 ] 00:22:19.571 }' 00:22:19.571 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.571 18:26:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:20.135 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:20.135 18:26:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:20.135 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:22:20.135 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:20.393 [2024-07-12 18:26:03.976266] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:20.393 [2024-07-12 18:26:03.976317] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:20.393 [2024-07-12 18:26:03.976335] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151e520 00:22:20.393 [2024-07-12 18:26:03.976348] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:20.393 [2024-07-12 18:26:03.976681] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:20.393 [2024-07-12 18:26:03.976698] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:20.393 [2024-07-12 18:26:03.976759] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:20.393 [2024-07-12 18:26:03.976778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:20.394 [2024-07-12 18:26:03.976886] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1372ea0 00:22:20.394 [2024-07-12 18:26:03.976896] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:20.394 [2024-07-12 18:26:03.977074] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1377600 00:22:20.394 [2024-07-12 18:26:03.977206] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1372ea0 00:22:20.394 [2024-07-12 18:26:03.977216] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0x1372ea0 00:22:20.394 [2024-07-12 18:26:03.977313] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:20.394 pt4 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.394 18:26:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.652 18:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.652 "name": "raid_bdev1", 00:22:20.652 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:20.652 "strip_size_kb": 0, 00:22:20.652 "state": "online", 00:22:20.652 "raid_level": "raid1", 00:22:20.652 "superblock": true, 00:22:20.652 "num_base_bdevs": 4, 00:22:20.652 "num_base_bdevs_discovered": 3, 00:22:20.652 
"num_base_bdevs_operational": 3, 00:22:20.652 "base_bdevs_list": [ 00:22:20.652 { 00:22:20.652 "name": null, 00:22:20.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.652 "is_configured": false, 00:22:20.652 "data_offset": 2048, 00:22:20.652 "data_size": 63488 00:22:20.652 }, 00:22:20.652 { 00:22:20.652 "name": "pt2", 00:22:20.652 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:20.652 "is_configured": true, 00:22:20.652 "data_offset": 2048, 00:22:20.652 "data_size": 63488 00:22:20.652 }, 00:22:20.652 { 00:22:20.652 "name": "pt3", 00:22:20.652 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:20.652 "is_configured": true, 00:22:20.652 "data_offset": 2048, 00:22:20.652 "data_size": 63488 00:22:20.652 }, 00:22:20.652 { 00:22:20.652 "name": "pt4", 00:22:20.652 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:20.652 "is_configured": true, 00:22:20.652 "data_offset": 2048, 00:22:20.652 "data_size": 63488 00:22:20.652 } 00:22:20.652 ] 00:22:20.652 }' 00:22:20.652 18:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.652 18:26:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:21.246 18:26:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:21.503 [2024-07-12 18:26:05.051093] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:21.503 [2024-07-12 18:26:05.051118] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:21.503 [2024-07-12 18:26:05.051165] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:21.503 [2024-07-12 18:26:05.051227] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:21.504 [2024-07-12 18:26:05.051238] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x1372ea0 name raid_bdev1, state offline 00:22:21.504 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.504 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:21.761 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:21.761 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:21.761 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:22:21.761 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:22:21.761 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:22.018 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:22.277 [2024-07-12 18:26:05.780994] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:22.277 [2024-07-12 18:26:05.781037] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.277 [2024-07-12 18:26:05.781055] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151e520 00:22:22.277 [2024-07-12 18:26:05.781067] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.277 [2024-07-12 18:26:05.782717] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.277 [2024-07-12 18:26:05.782748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:22.277 [2024-07-12 18:26:05.782813] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:22:22.277 [2024-07-12 18:26:05.782838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:22.277 [2024-07-12 18:26:05.782949] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:22.277 [2024-07-12 18:26:05.782963] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:22.277 [2024-07-12 18:26:05.782977] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1372060 name raid_bdev1, state configuring 00:22:22.277 [2024-07-12 18:26:05.783001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:22.277 [2024-07-12 18:26:05.783075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:22.277 pt1 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.277 18:26:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.277 18:26:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.535 18:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.535 "name": "raid_bdev1", 00:22:22.535 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:22.535 "strip_size_kb": 0, 00:22:22.535 "state": "configuring", 00:22:22.535 "raid_level": "raid1", 00:22:22.535 "superblock": true, 00:22:22.535 "num_base_bdevs": 4, 00:22:22.535 "num_base_bdevs_discovered": 2, 00:22:22.535 "num_base_bdevs_operational": 3, 00:22:22.535 "base_bdevs_list": [ 00:22:22.535 { 00:22:22.535 "name": null, 00:22:22.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.535 "is_configured": false, 00:22:22.535 "data_offset": 2048, 00:22:22.535 "data_size": 63488 00:22:22.535 }, 00:22:22.535 { 00:22:22.535 "name": "pt2", 00:22:22.535 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:22.535 "is_configured": true, 00:22:22.535 "data_offset": 2048, 00:22:22.535 "data_size": 63488 00:22:22.535 }, 00:22:22.535 { 00:22:22.535 "name": "pt3", 00:22:22.535 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:22.535 "is_configured": true, 00:22:22.535 "data_offset": 2048, 00:22:22.535 "data_size": 63488 00:22:22.535 }, 00:22:22.535 { 00:22:22.535 "name": null, 00:22:22.535 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:22.535 "is_configured": false, 00:22:22.535 "data_offset": 2048, 00:22:22.535 "data_size": 63488 00:22:22.535 } 00:22:22.535 ] 00:22:22.535 }' 00:22:22.536 18:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.536 18:26:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:22:23.102 18:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:22:23.102 18:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:23.360 18:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:22:23.360 18:26:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:23.619 [2024-07-12 18:26:07.092485] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:23.619 [2024-07-12 18:26:07.092534] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:23.619 [2024-07-12 18:26:07.092554] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1372310 00:22:23.619 [2024-07-12 18:26:07.092566] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:23.619 [2024-07-12 18:26:07.092894] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:23.619 [2024-07-12 18:26:07.092914] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:23.619 [2024-07-12 18:26:07.092980] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:23.619 [2024-07-12 18:26:07.093001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:23.619 [2024-07-12 18:26:07.093112] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1375b40 00:22:23.619 [2024-07-12 18:26:07.093123] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:23.619 [2024-07-12 18:26:07.093293] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1515990 00:22:23.619 [2024-07-12 18:26:07.093431] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1375b40 00:22:23.619 [2024-07-12 18:26:07.093441] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1375b40 00:22:23.619 [2024-07-12 18:26:07.093535] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:23.619 pt4 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.619 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.877 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.877 "name": "raid_bdev1", 
00:22:23.877 "uuid": "873e1549-bc2e-4d65-af75-ce4e9ea819a1", 00:22:23.877 "strip_size_kb": 0, 00:22:23.877 "state": "online", 00:22:23.877 "raid_level": "raid1", 00:22:23.877 "superblock": true, 00:22:23.877 "num_base_bdevs": 4, 00:22:23.877 "num_base_bdevs_discovered": 3, 00:22:23.877 "num_base_bdevs_operational": 3, 00:22:23.877 "base_bdevs_list": [ 00:22:23.877 { 00:22:23.878 "name": null, 00:22:23.878 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.878 "is_configured": false, 00:22:23.878 "data_offset": 2048, 00:22:23.878 "data_size": 63488 00:22:23.878 }, 00:22:23.878 { 00:22:23.878 "name": "pt2", 00:22:23.878 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:23.878 "is_configured": true, 00:22:23.878 "data_offset": 2048, 00:22:23.878 "data_size": 63488 00:22:23.878 }, 00:22:23.878 { 00:22:23.878 "name": "pt3", 00:22:23.878 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:23.878 "is_configured": true, 00:22:23.878 "data_offset": 2048, 00:22:23.878 "data_size": 63488 00:22:23.878 }, 00:22:23.878 { 00:22:23.878 "name": "pt4", 00:22:23.878 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:23.878 "is_configured": true, 00:22:23.878 "data_offset": 2048, 00:22:23.878 "data_size": 63488 00:22:23.878 } 00:22:23.878 ] 00:22:23.878 }' 00:22:23.878 18:26:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.878 18:26:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:24.444 18:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:24.444 18:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:24.703 18:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:24.703 18:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:24.703 18:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:24.961 [2024-07-12 18:26:08.476435] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 873e1549-bc2e-4d65-af75-ce4e9ea819a1 '!=' 873e1549-bc2e-4d65-af75-ce4e9ea819a1 ']' 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2558879 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2558879 ']' 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2558879 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2558879 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2558879' 00:22:24.961 killing process with pid 2558879 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2558879 00:22:24.961 [2024-07-12 18:26:08.547235] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:24.961 [2024-07-12 18:26:08.547296] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:24.961 [2024-07-12 18:26:08.547362] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:22:24.961 [2024-07-12 18:26:08.547375] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1375b40 name raid_bdev1, state offline 00:22:24.961 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2558879 00:22:24.961 [2024-07-12 18:26:08.585737] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:25.219 18:26:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:25.219 00:22:25.219 real 0m26.763s 00:22:25.219 user 0m49.092s 00:22:25.219 sys 0m4.690s 00:22:25.219 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:25.219 18:26:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:25.219 ************************************ 00:22:25.219 END TEST raid_superblock_test 00:22:25.219 ************************************ 00:22:25.219 18:26:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:25.219 18:26:08 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:22:25.219 18:26:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:25.219 18:26:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:25.219 18:26:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:25.219 ************************************ 00:22:25.219 START TEST raid_read_error_test 00:22:25.219 ************************************ 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:25.219 
18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:25.219 18:26:08 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.C1JE28tzG7 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2562761 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2562761 /var/tmp/spdk-raid.sock 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2562761 ']' 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:25.219 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:25.219 18:26:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:25.477 [2024-07-12 18:26:08.978533] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:22:25.477 [2024-07-12 18:26:08.978603] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2562761 ] 00:22:25.477 [2024-07-12 18:26:09.108617] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:25.734 [2024-07-12 18:26:09.207951] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:25.734 [2024-07-12 18:26:09.273101] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:25.734 [2024-07-12 18:26:09.273160] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:26.298 18:26:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:26.299 18:26:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:26.299 18:26:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:26.299 18:26:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:26.557 BaseBdev1_malloc 00:22:26.557 18:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:26.815 true 00:22:26.815 18:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:27.074 [2024-07-12 18:26:10.544593] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:27.074 [2024-07-12 18:26:10.544640] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:27.074 [2024-07-12 18:26:10.544666] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151b0d0 00:22:27.074 [2024-07-12 18:26:10.544679] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:27.074 [2024-07-12 18:26:10.546522] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:27.074 [2024-07-12 18:26:10.546554] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:27.074 BaseBdev1 00:22:27.074 18:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:27.074 18:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:27.074 BaseBdev2_malloc 00:22:27.074 18:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:27.333 true 00:22:27.333 18:26:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:27.592 [2024-07-12 18:26:11.194973] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:27.592 [2024-07-12 18:26:11.195021] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:27.592 [2024-07-12 18:26:11.195043] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x151f910 00:22:27.592 [2024-07-12 18:26:11.195055] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:27.592 [2024-07-12 18:26:11.196623] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:27.592 [2024-07-12 18:26:11.196654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:27.592 BaseBdev2 00:22:27.592 18:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:27.592 18:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:27.851 BaseBdev3_malloc 00:22:27.851 18:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:28.110 true 00:22:28.110 18:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:28.369 [2024-07-12 18:26:11.922760] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:28.369 [2024-07-12 18:26:11.922812] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:28.369 [2024-07-12 18:26:11.922833] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1521bd0 00:22:28.369 [2024-07-12 18:26:11.922845] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:28.369 [2024-07-12 18:26:11.924457] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:28.369 [2024-07-12 18:26:11.924491] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:28.369 
BaseBdev3 00:22:28.369 18:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:28.369 18:26:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:28.635 BaseBdev4_malloc 00:22:28.635 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:28.897 true 00:22:28.897 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:28.897 [2024-07-12 18:26:12.578395] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:28.897 [2024-07-12 18:26:12.578449] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:28.897 [2024-07-12 18:26:12.578471] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1522aa0 00:22:28.897 [2024-07-12 18:26:12.578483] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:28.897 [2024-07-12 18:26:12.580046] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:28.897 [2024-07-12 18:26:12.580078] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:28.897 BaseBdev4 00:22:28.897 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:29.156 [2024-07-12 18:26:12.819068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:29.156 [2024-07-12 
18:26:12.820396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:29.156 [2024-07-12 18:26:12.820467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:29.156 [2024-07-12 18:26:12.820528] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:29.156 [2024-07-12 18:26:12.820760] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x151cc20 00:22:29.156 [2024-07-12 18:26:12.820771] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:29.156 [2024-07-12 18:26:12.820979] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1371260 00:22:29.156 [2024-07-12 18:26:12.821137] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x151cc20 00:22:29.156 [2024-07-12 18:26:12.821148] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x151cc20 00:22:29.156 [2024-07-12 18:26:12.821257] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:29.156 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:29.156 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:29.156 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:29.156 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:29.156 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:29.156 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:29.156 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.156 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:22:29.156 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.157 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.157 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.157 18:26:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.415 18:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:29.415 "name": "raid_bdev1", 00:22:29.415 "uuid": "c20de653-a87f-4928-a255-862e928fb022", 00:22:29.415 "strip_size_kb": 0, 00:22:29.415 "state": "online", 00:22:29.415 "raid_level": "raid1", 00:22:29.415 "superblock": true, 00:22:29.415 "num_base_bdevs": 4, 00:22:29.415 "num_base_bdevs_discovered": 4, 00:22:29.415 "num_base_bdevs_operational": 4, 00:22:29.415 "base_bdevs_list": [ 00:22:29.415 { 00:22:29.415 "name": "BaseBdev1", 00:22:29.415 "uuid": "c72e7205-330a-598e-90ec-0235702cf87d", 00:22:29.415 "is_configured": true, 00:22:29.415 "data_offset": 2048, 00:22:29.415 "data_size": 63488 00:22:29.415 }, 00:22:29.415 { 00:22:29.415 "name": "BaseBdev2", 00:22:29.415 "uuid": "2d138f6a-ef16-542c-a197-a16b9b6d2e3a", 00:22:29.415 "is_configured": true, 00:22:29.415 "data_offset": 2048, 00:22:29.415 "data_size": 63488 00:22:29.415 }, 00:22:29.415 { 00:22:29.415 "name": "BaseBdev3", 00:22:29.415 "uuid": "2542650b-cbb2-5277-aa2d-14b148c71c1a", 00:22:29.415 "is_configured": true, 00:22:29.415 "data_offset": 2048, 00:22:29.415 "data_size": 63488 00:22:29.415 }, 00:22:29.415 { 00:22:29.415 "name": "BaseBdev4", 00:22:29.415 "uuid": "16e58d00-1b38-596a-9b37-e18d4e81181c", 00:22:29.415 "is_configured": true, 00:22:29.415 "data_offset": 2048, 00:22:29.415 "data_size": 63488 00:22:29.415 } 00:22:29.415 ] 00:22:29.415 }' 00:22:29.415 18:26:13 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:29.415 18:26:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:29.980 18:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:29.980 18:26:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:29.980 [2024-07-12 18:26:13.705716] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1370c60 00:22:30.917 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.176 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:31.435 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:31.435 "name": "raid_bdev1", 00:22:31.435 "uuid": "c20de653-a87f-4928-a255-862e928fb022", 00:22:31.435 "strip_size_kb": 0, 00:22:31.435 "state": "online", 00:22:31.435 "raid_level": "raid1", 00:22:31.435 "superblock": true, 00:22:31.435 "num_base_bdevs": 4, 00:22:31.435 "num_base_bdevs_discovered": 4, 00:22:31.435 "num_base_bdevs_operational": 4, 00:22:31.435 "base_bdevs_list": [ 00:22:31.435 { 00:22:31.435 "name": "BaseBdev1", 00:22:31.435 "uuid": "c72e7205-330a-598e-90ec-0235702cf87d", 00:22:31.435 "is_configured": true, 00:22:31.435 "data_offset": 2048, 00:22:31.435 "data_size": 63488 00:22:31.435 }, 00:22:31.435 { 00:22:31.435 "name": "BaseBdev2", 00:22:31.435 "uuid": "2d138f6a-ef16-542c-a197-a16b9b6d2e3a", 00:22:31.435 "is_configured": true, 00:22:31.435 "data_offset": 2048, 00:22:31.435 "data_size": 63488 00:22:31.435 }, 00:22:31.435 { 00:22:31.435 "name": "BaseBdev3", 00:22:31.435 "uuid": "2542650b-cbb2-5277-aa2d-14b148c71c1a", 00:22:31.435 "is_configured": true, 00:22:31.435 "data_offset": 2048, 00:22:31.435 "data_size": 63488 00:22:31.435 }, 00:22:31.435 { 00:22:31.435 "name": "BaseBdev4", 00:22:31.435 "uuid": "16e58d00-1b38-596a-9b37-e18d4e81181c", 00:22:31.435 "is_configured": 
true, 00:22:31.435 "data_offset": 2048, 00:22:31.435 "data_size": 63488 00:22:31.435 } 00:22:31.435 ] 00:22:31.435 }' 00:22:31.435 18:26:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:31.435 18:26:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:32.369 18:26:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:32.369 [2024-07-12 18:26:16.078597] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:32.370 [2024-07-12 18:26:16.078640] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:32.370 [2024-07-12 18:26:16.081901] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:32.370 [2024-07-12 18:26:16.081962] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:32.370 [2024-07-12 18:26:16.082080] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:32.370 [2024-07-12 18:26:16.082092] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151cc20 name raid_bdev1, state offline 00:22:32.370 0 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2562761 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2562761 ']' 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2562761 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2562761 00:22:32.629 18:26:16 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2562761' 00:22:32.629 killing process with pid 2562761 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2562761 00:22:32.629 [2024-07-12 18:26:16.145425] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:32.629 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2562761 00:22:32.629 [2024-07-12 18:26:16.176929] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.C1JE28tzG7 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:32.888 00:22:32.888 real 0m7.514s 00:22:32.888 user 0m11.961s 00:22:32.888 sys 0m1.344s 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:32.888 18:26:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:32.888 ************************************ 00:22:32.888 END TEST 
raid_read_error_test 00:22:32.888 ************************************ 00:22:32.888 18:26:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:32.888 18:26:16 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:22:32.888 18:26:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:32.888 18:26:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:32.888 18:26:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:32.888 ************************************ 00:22:32.888 START TEST raid_write_error_test 00:22:32.888 ************************************ 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.mKIIUU6NXV 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2563890 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2563890 /var/tmp/spdk-raid.sock 00:22:32.888 
18:26:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2563890 ']' 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:32.888 18:26:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:32.888 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:32.889 18:26:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:32.889 18:26:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:32.889 [2024-07-12 18:26:16.577372] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:22:32.889 [2024-07-12 18:26:16.577440] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2563890 ] 00:22:33.148 [2024-07-12 18:26:16.705180] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:33.148 [2024-07-12 18:26:16.806582] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:33.148 [2024-07-12 18:26:16.864800] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:33.148 [2024-07-12 18:26:16.864845] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:34.086 18:26:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:34.086 18:26:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:34.086 18:26:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:34.086 18:26:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:34.086 BaseBdev1_malloc 00:22:34.086 18:26:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:34.345 true 00:22:34.345 18:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:34.604 [2024-07-12 18:26:18.230690] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:34.605 [2024-07-12 18:26:18.230737] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:22:34.605 [2024-07-12 18:26:18.230758] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x274a0d0 00:22:34.605 [2024-07-12 18:26:18.230771] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:34.605 [2024-07-12 18:26:18.232654] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:34.605 [2024-07-12 18:26:18.232686] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:34.605 BaseBdev1 00:22:34.605 18:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:34.605 18:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:34.863 BaseBdev2_malloc 00:22:34.863 18:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:35.122 true 00:22:35.123 18:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:35.382 [2024-07-12 18:26:18.965422] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:35.382 [2024-07-12 18:26:18.965467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:35.382 [2024-07-12 18:26:18.965488] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x274e910 00:22:35.382 [2024-07-12 18:26:18.965501] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:35.382 [2024-07-12 18:26:18.967053] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:35.382 [2024-07-12 18:26:18.967083] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:35.382 BaseBdev2 00:22:35.382 18:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:35.382 18:26:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:35.641 BaseBdev3_malloc 00:22:35.641 18:26:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:35.900 true 00:22:35.900 18:26:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:36.159 [2024-07-12 18:26:19.679895] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:36.159 [2024-07-12 18:26:19.679944] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:36.159 [2024-07-12 18:26:19.679972] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2750bd0 00:22:36.159 [2024-07-12 18:26:19.679986] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:36.159 [2024-07-12 18:26:19.681552] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:36.159 [2024-07-12 18:26:19.681583] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:36.159 BaseBdev3 00:22:36.159 18:26:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:36.159 18:26:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:36.417 BaseBdev4_malloc 00:22:36.417 18:26:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:36.676 true 00:22:36.676 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:36.934 [2024-07-12 18:26:20.423711] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:36.934 [2024-07-12 18:26:20.423757] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:36.934 [2024-07-12 18:26:20.423778] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2751aa0 00:22:36.934 [2024-07-12 18:26:20.423791] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:36.934 [2024-07-12 18:26:20.425357] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:36.934 [2024-07-12 18:26:20.425385] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:36.934 BaseBdev4 00:22:36.934 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:37.193 [2024-07-12 18:26:20.668552] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:37.193 [2024-07-12 18:26:20.669885] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:37.193 [2024-07-12 18:26:20.669964] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:37.193 [2024-07-12 18:26:20.670026] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:37.193 [2024-07-12 18:26:20.670256] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x274bc20 00:22:37.193 [2024-07-12 18:26:20.670267] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:37.193 [2024-07-12 18:26:20.670467] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25a0260 00:22:37.193 [2024-07-12 18:26:20.670622] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x274bc20 00:22:37.193 [2024-07-12 18:26:20.670632] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x274bc20 00:22:37.193 [2024-07-12 18:26:20.670740] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.193 18:26:20 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.193 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.452 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.452 "name": "raid_bdev1", 00:22:37.452 "uuid": "75061ca5-890a-46a0-92b2-76c360028394", 00:22:37.452 "strip_size_kb": 0, 00:22:37.452 "state": "online", 00:22:37.452 "raid_level": "raid1", 00:22:37.452 "superblock": true, 00:22:37.452 "num_base_bdevs": 4, 00:22:37.452 "num_base_bdevs_discovered": 4, 00:22:37.452 "num_base_bdevs_operational": 4, 00:22:37.452 "base_bdevs_list": [ 00:22:37.452 { 00:22:37.452 "name": "BaseBdev1", 00:22:37.452 "uuid": "27b69e6b-6b7b-58b9-b261-9f5b8269aea3", 00:22:37.452 "is_configured": true, 00:22:37.452 "data_offset": 2048, 00:22:37.452 "data_size": 63488 00:22:37.452 }, 00:22:37.452 { 00:22:37.452 "name": "BaseBdev2", 00:22:37.452 "uuid": "19d55adf-db4f-55ee-aa28-c7cda32228d2", 00:22:37.452 "is_configured": true, 00:22:37.452 "data_offset": 2048, 00:22:37.452 "data_size": 63488 00:22:37.452 }, 00:22:37.452 { 00:22:37.452 "name": "BaseBdev3", 00:22:37.452 "uuid": "2b376320-9da4-5740-a7f1-e0ba357a5bae", 00:22:37.452 "is_configured": true, 00:22:37.452 "data_offset": 2048, 00:22:37.452 "data_size": 63488 00:22:37.452 }, 00:22:37.452 { 00:22:37.452 "name": "BaseBdev4", 00:22:37.452 "uuid": "84ef7432-2cdf-52ad-a699-eb12687ada76", 00:22:37.452 "is_configured": true, 00:22:37.452 "data_offset": 2048, 00:22:37.452 "data_size": 63488 00:22:37.452 } 00:22:37.452 ] 00:22:37.452 }' 00:22:37.452 18:26:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.452 18:26:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:38.016 18:26:21 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:22:38.016 18:26:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:38.016 [2024-07-12 18:26:21.643408] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259fc60 00:22:38.952 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:39.211 [2024-07-12 18:26:22.763507] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:22:39.211 [2024-07-12 18:26:22.763568] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:39.211 [2024-07-12 18:26:22.763777] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x259fc60 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.211 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.470 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:39.470 "name": "raid_bdev1", 00:22:39.470 "uuid": "75061ca5-890a-46a0-92b2-76c360028394", 00:22:39.470 "strip_size_kb": 0, 00:22:39.470 "state": "online", 00:22:39.470 "raid_level": "raid1", 00:22:39.470 "superblock": true, 00:22:39.470 "num_base_bdevs": 4, 00:22:39.470 "num_base_bdevs_discovered": 3, 00:22:39.470 "num_base_bdevs_operational": 3, 00:22:39.470 "base_bdevs_list": [ 00:22:39.470 { 00:22:39.470 "name": null, 00:22:39.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.470 "is_configured": false, 00:22:39.470 "data_offset": 2048, 00:22:39.470 "data_size": 63488 00:22:39.470 }, 00:22:39.470 { 00:22:39.470 "name": "BaseBdev2", 00:22:39.470 "uuid": "19d55adf-db4f-55ee-aa28-c7cda32228d2", 00:22:39.470 "is_configured": true, 00:22:39.470 "data_offset": 2048, 00:22:39.470 "data_size": 63488 00:22:39.470 }, 00:22:39.470 { 00:22:39.470 "name": "BaseBdev3", 00:22:39.470 "uuid": "2b376320-9da4-5740-a7f1-e0ba357a5bae", 00:22:39.470 "is_configured": true, 00:22:39.470 "data_offset": 2048, 
00:22:39.470 "data_size": 63488 00:22:39.470 }, 00:22:39.470 { 00:22:39.470 "name": "BaseBdev4", 00:22:39.470 "uuid": "84ef7432-2cdf-52ad-a699-eb12687ada76", 00:22:39.470 "is_configured": true, 00:22:39.470 "data_offset": 2048, 00:22:39.470 "data_size": 63488 00:22:39.470 } 00:22:39.470 ] 00:22:39.470 }' 00:22:39.470 18:26:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:39.470 18:26:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:40.038 18:26:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:40.297 [2024-07-12 18:26:23.814493] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:40.297 [2024-07-12 18:26:23.814534] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:40.297 [2024-07-12 18:26:23.817657] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:40.297 [2024-07-12 18:26:23.817692] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:40.297 [2024-07-12 18:26:23.817787] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:40.298 [2024-07-12 18:26:23.817799] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x274bc20 name raid_bdev1, state offline 00:22:40.298 0 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2563890 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2563890 ']' 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2563890 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2563890 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2563890' 00:22:40.298 killing process with pid 2563890 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2563890 00:22:40.298 [2024-07-12 18:26:23.882523] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:40.298 18:26:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2563890 00:22:40.298 [2024-07-12 18:26:23.914433] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:40.556 18:26:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.mKIIUU6NXV 00:22:40.556 18:26:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:40.556 18:26:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:40.556 18:26:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:40.556 18:26:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:40.556 18:26:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:40.556 18:26:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:40.556 18:26:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:40.556 00:22:40.556 real 0m7.658s 00:22:40.556 user 0m12.248s 00:22:40.556 sys 0m1.339s 00:22:40.556 18:26:24 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:22:40.557 18:26:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:40.557 ************************************ 00:22:40.557 END TEST raid_write_error_test 00:22:40.557 ************************************ 00:22:40.557 18:26:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:40.557 18:26:24 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:22:40.557 18:26:24 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:22:40.557 18:26:24 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:22:40.557 18:26:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:40.557 18:26:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:40.557 18:26:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:40.557 ************************************ 00:22:40.557 START TEST raid_rebuild_test 00:22:40.557 ************************************ 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 
00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2565026 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2565026 /var/tmp/spdk-raid.sock 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:40.557 18:26:24 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2565026 ']' 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:40.557 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:40.557 18:26:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:40.816 [2024-07-12 18:26:24.324286] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:22:40.816 [2024-07-12 18:26:24.324360] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2565026 ] 00:22:40.816 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:40.816 Zero copy mechanism will not be used. 
00:22:40.816 [2024-07-12 18:26:24.451920] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:41.074 [2024-07-12 18:26:24.557298] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:41.074 [2024-07-12 18:26:24.616951] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:41.074 [2024-07-12 18:26:24.616985] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:41.674 18:26:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:41.674 18:26:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:22:41.674 18:26:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:41.674 18:26:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:41.983 BaseBdev1_malloc 00:22:41.983 18:26:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:42.242 [2024-07-12 18:26:25.729435] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:42.242 [2024-07-12 18:26:25.729483] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:42.242 [2024-07-12 18:26:25.729505] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x134ed40 00:22:42.242 [2024-07-12 18:26:25.729517] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:42.242 [2024-07-12 18:26:25.731105] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:42.242 [2024-07-12 18:26:25.731135] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:42.242 BaseBdev1 00:22:42.243 18:26:25 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:42.243 18:26:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:42.501 BaseBdev2_malloc 00:22:42.501 18:26:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:42.501 [2024-07-12 18:26:26.223513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:42.501 [2024-07-12 18:26:26.223558] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:42.501 [2024-07-12 18:26:26.223579] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x134f860 00:22:42.501 [2024-07-12 18:26:26.223592] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:42.501 [2024-07-12 18:26:26.224954] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:42.501 [2024-07-12 18:26:26.224982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:42.760 BaseBdev2 00:22:42.760 18:26:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:42.760 spare_malloc 00:22:43.018 18:26:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:43.018 spare_delay 00:22:43.018 18:26:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:22:43.276 [2024-07-12 18:26:26.953993] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:43.276 [2024-07-12 18:26:26.954039] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:43.276 [2024-07-12 18:26:26.954059] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14fdec0 00:22:43.276 [2024-07-12 18:26:26.954071] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:43.276 [2024-07-12 18:26:26.955521] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:43.276 [2024-07-12 18:26:26.955551] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:43.276 spare 00:22:43.276 18:26:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:43.535 [2024-07-12 18:26:27.198646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:43.535 [2024-07-12 18:26:27.199804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:43.535 [2024-07-12 18:26:27.199875] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14ff070 00:22:43.535 [2024-07-12 18:26:27.199886] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:43.535 [2024-07-12 18:26:27.200091] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14f8490 00:22:43.535 [2024-07-12 18:26:27.200224] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14ff070 00:22:43.535 [2024-07-12 18:26:27.200234] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14ff070 00:22:43.535 [2024-07-12 18:26:27.200337] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:43.535 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.536 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:43.794 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:43.794 "name": "raid_bdev1", 00:22:43.794 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:22:43.794 "strip_size_kb": 0, 00:22:43.794 "state": "online", 00:22:43.794 "raid_level": "raid1", 00:22:43.794 "superblock": false, 00:22:43.794 "num_base_bdevs": 2, 00:22:43.794 "num_base_bdevs_discovered": 2, 00:22:43.794 "num_base_bdevs_operational": 2, 00:22:43.794 "base_bdevs_list": [ 00:22:43.794 { 00:22:43.794 "name": "BaseBdev1", 00:22:43.794 "uuid": 
"5f012bb1-3549-52de-8ecb-b1f9345d86f0", 00:22:43.795 "is_configured": true, 00:22:43.795 "data_offset": 0, 00:22:43.795 "data_size": 65536 00:22:43.795 }, 00:22:43.795 { 00:22:43.795 "name": "BaseBdev2", 00:22:43.795 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:22:43.795 "is_configured": true, 00:22:43.795 "data_offset": 0, 00:22:43.795 "data_size": 65536 00:22:43.795 } 00:22:43.795 ] 00:22:43.795 }' 00:22:43.795 18:26:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:43.795 18:26:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:44.361 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:44.361 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:44.619 [2024-07-12 18:26:28.205547] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:44.619 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:44.619 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.619 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:44.877 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:44.877 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:44.877 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:44.877 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:44.878 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:44.878 18:26:28 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:44.878 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:44.878 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:44.878 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:44.878 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:44.878 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:44.878 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:44.878 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:44.878 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:45.135 [2024-07-12 18:26:28.710679] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14f8490 00:22:45.135 /dev/nbd0 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:45.135 1+0 records in 00:22:45.135 1+0 records out 00:22:45.135 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250096 s, 16.4 MB/s 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:45.135 18:26:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:22:51.701 65536+0 records in 00:22:51.701 65536+0 records out 00:22:51.701 33554432 bytes (34 MB, 32 MiB) copied, 6.18428 s, 5.4 MB/s 00:22:51.701 18:26:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:51.701 18:26:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:22:51.701 18:26:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:51.701 18:26:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:51.701 18:26:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:51.701 18:26:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:51.701 18:26:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:51.701 [2024-07-12 18:26:35.220616] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:51.701 [2024-07-12 18:26:35.389114] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:51.701 18:26:35 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.701 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.960 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:51.960 "name": "raid_bdev1", 00:22:51.960 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:22:51.960 "strip_size_kb": 0, 00:22:51.960 "state": "online", 00:22:51.960 "raid_level": "raid1", 00:22:51.960 "superblock": false, 00:22:51.960 "num_base_bdevs": 2, 00:22:51.960 "num_base_bdevs_discovered": 1, 00:22:51.960 "num_base_bdevs_operational": 1, 00:22:51.960 "base_bdevs_list": [ 00:22:51.960 { 00:22:51.960 "name": null, 00:22:51.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:51.960 "is_configured": false, 00:22:51.960 "data_offset": 0, 00:22:51.960 "data_size": 65536 00:22:51.960 }, 00:22:51.960 { 00:22:51.960 "name": "BaseBdev2", 
00:22:51.960 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:22:51.960 "is_configured": true, 00:22:51.960 "data_offset": 0, 00:22:51.960 "data_size": 65536 00:22:51.960 } 00:22:51.960 ] 00:22:51.960 }' 00:22:51.960 18:26:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:51.960 18:26:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:52.527 18:26:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:52.787 [2024-07-12 18:26:36.468008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:52.787 [2024-07-12 18:26:36.472987] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14ff880 00:22:52.787 [2024-07-12 18:26:36.475196] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:52.787 18:26:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:54.164 "name": "raid_bdev1", 00:22:54.164 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:22:54.164 "strip_size_kb": 0, 00:22:54.164 "state": "online", 00:22:54.164 "raid_level": "raid1", 00:22:54.164 "superblock": false, 00:22:54.164 "num_base_bdevs": 2, 00:22:54.164 "num_base_bdevs_discovered": 2, 00:22:54.164 "num_base_bdevs_operational": 2, 00:22:54.164 "process": { 00:22:54.164 "type": "rebuild", 00:22:54.164 "target": "spare", 00:22:54.164 "progress": { 00:22:54.164 "blocks": 24576, 00:22:54.164 "percent": 37 00:22:54.164 } 00:22:54.164 }, 00:22:54.164 "base_bdevs_list": [ 00:22:54.164 { 00:22:54.164 "name": "spare", 00:22:54.164 "uuid": "85059c6f-9637-585f-ac7c-1e3e1f114ecb", 00:22:54.164 "is_configured": true, 00:22:54.164 "data_offset": 0, 00:22:54.164 "data_size": 65536 00:22:54.164 }, 00:22:54.164 { 00:22:54.164 "name": "BaseBdev2", 00:22:54.164 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:22:54.164 "is_configured": true, 00:22:54.164 "data_offset": 0, 00:22:54.164 "data_size": 65536 00:22:54.164 } 00:22:54.164 ] 00:22:54.164 }' 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:54.164 18:26:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:54.423 [2024-07-12 18:26:38.062075] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:54.423 [2024-07-12 18:26:38.087983] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:22:54.423 [2024-07-12 18:26:38.088027] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:54.423 [2024-07-12 18:26:38.088043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:54.423 [2024-07-12 18:26:38.088051] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.423 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.681 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:54.681 "name": "raid_bdev1", 00:22:54.681 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:22:54.681 
"strip_size_kb": 0, 00:22:54.681 "state": "online", 00:22:54.681 "raid_level": "raid1", 00:22:54.681 "superblock": false, 00:22:54.681 "num_base_bdevs": 2, 00:22:54.681 "num_base_bdevs_discovered": 1, 00:22:54.681 "num_base_bdevs_operational": 1, 00:22:54.681 "base_bdevs_list": [ 00:22:54.681 { 00:22:54.681 "name": null, 00:22:54.681 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:54.681 "is_configured": false, 00:22:54.681 "data_offset": 0, 00:22:54.681 "data_size": 65536 00:22:54.681 }, 00:22:54.681 { 00:22:54.681 "name": "BaseBdev2", 00:22:54.681 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:22:54.681 "is_configured": true, 00:22:54.681 "data_offset": 0, 00:22:54.681 "data_size": 65536 00:22:54.681 } 00:22:54.681 ] 00:22:54.681 }' 00:22:54.681 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:54.681 18:26:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:55.248 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:55.248 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:55.248 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:55.248 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:55.248 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:55.248 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.248 18:26:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.506 18:26:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:55.506 "name": "raid_bdev1", 00:22:55.506 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 
00:22:55.506 "strip_size_kb": 0, 00:22:55.506 "state": "online", 00:22:55.506 "raid_level": "raid1", 00:22:55.506 "superblock": false, 00:22:55.506 "num_base_bdevs": 2, 00:22:55.506 "num_base_bdevs_discovered": 1, 00:22:55.506 "num_base_bdevs_operational": 1, 00:22:55.506 "base_bdevs_list": [ 00:22:55.506 { 00:22:55.506 "name": null, 00:22:55.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:55.506 "is_configured": false, 00:22:55.506 "data_offset": 0, 00:22:55.506 "data_size": 65536 00:22:55.506 }, 00:22:55.506 { 00:22:55.506 "name": "BaseBdev2", 00:22:55.506 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:22:55.507 "is_configured": true, 00:22:55.507 "data_offset": 0, 00:22:55.507 "data_size": 65536 00:22:55.507 } 00:22:55.507 ] 00:22:55.507 }' 00:22:55.507 18:26:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:55.507 18:26:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:55.765 18:26:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:55.765 18:26:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:55.765 18:26:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:56.024 [2024-07-12 18:26:39.501135] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:56.024 [2024-07-12 18:26:39.506076] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14f8490 00:22:56.024 [2024-07-12 18:26:39.507536] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:56.024 18:26:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:56.958 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:22:56.958 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:56.958 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:56.958 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:56.958 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:56.958 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.958 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:57.216 "name": "raid_bdev1", 00:22:57.216 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:22:57.216 "strip_size_kb": 0, 00:22:57.216 "state": "online", 00:22:57.216 "raid_level": "raid1", 00:22:57.216 "superblock": false, 00:22:57.216 "num_base_bdevs": 2, 00:22:57.216 "num_base_bdevs_discovered": 2, 00:22:57.216 "num_base_bdevs_operational": 2, 00:22:57.216 "process": { 00:22:57.216 "type": "rebuild", 00:22:57.216 "target": "spare", 00:22:57.216 "progress": { 00:22:57.216 "blocks": 24576, 00:22:57.216 "percent": 37 00:22:57.216 } 00:22:57.216 }, 00:22:57.216 "base_bdevs_list": [ 00:22:57.216 { 00:22:57.216 "name": "spare", 00:22:57.216 "uuid": "85059c6f-9637-585f-ac7c-1e3e1f114ecb", 00:22:57.216 "is_configured": true, 00:22:57.216 "data_offset": 0, 00:22:57.216 "data_size": 65536 00:22:57.216 }, 00:22:57.216 { 00:22:57.216 "name": "BaseBdev2", 00:22:57.216 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:22:57.216 "is_configured": true, 00:22:57.216 "data_offset": 0, 00:22:57.216 "data_size": 65536 00:22:57.216 } 00:22:57.216 ] 00:22:57.216 }' 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # 
jq -r '.process.type // "none"' 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=773 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.216 18:26:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.475 18:26:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:57.475 "name": "raid_bdev1", 00:22:57.475 
"uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:22:57.475 "strip_size_kb": 0, 00:22:57.475 "state": "online", 00:22:57.475 "raid_level": "raid1", 00:22:57.475 "superblock": false, 00:22:57.475 "num_base_bdevs": 2, 00:22:57.475 "num_base_bdevs_discovered": 2, 00:22:57.475 "num_base_bdevs_operational": 2, 00:22:57.475 "process": { 00:22:57.475 "type": "rebuild", 00:22:57.475 "target": "spare", 00:22:57.475 "progress": { 00:22:57.475 "blocks": 30720, 00:22:57.475 "percent": 46 00:22:57.475 } 00:22:57.475 }, 00:22:57.475 "base_bdevs_list": [ 00:22:57.475 { 00:22:57.475 "name": "spare", 00:22:57.475 "uuid": "85059c6f-9637-585f-ac7c-1e3e1f114ecb", 00:22:57.475 "is_configured": true, 00:22:57.475 "data_offset": 0, 00:22:57.475 "data_size": 65536 00:22:57.475 }, 00:22:57.475 { 00:22:57.475 "name": "BaseBdev2", 00:22:57.475 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:22:57.475 "is_configured": true, 00:22:57.475 "data_offset": 0, 00:22:57.475 "data_size": 65536 00:22:57.475 } 00:22:57.475 ] 00:22:57.475 }' 00:22:57.475 18:26:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:57.475 18:26:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:57.475 18:26:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:57.476 18:26:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:57.476 18:26:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:58.854 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:58.854 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:58.855 "name": "raid_bdev1", 00:22:58.855 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:22:58.855 "strip_size_kb": 0, 00:22:58.855 "state": "online", 00:22:58.855 "raid_level": "raid1", 00:22:58.855 "superblock": false, 00:22:58.855 "num_base_bdevs": 2, 00:22:58.855 "num_base_bdevs_discovered": 2, 00:22:58.855 "num_base_bdevs_operational": 2, 00:22:58.855 "process": { 00:22:58.855 "type": "rebuild", 00:22:58.855 "target": "spare", 00:22:58.855 "progress": { 00:22:58.855 "blocks": 57344, 00:22:58.855 "percent": 87 00:22:58.855 } 00:22:58.855 }, 00:22:58.855 "base_bdevs_list": [ 00:22:58.855 { 00:22:58.855 "name": "spare", 00:22:58.855 "uuid": "85059c6f-9637-585f-ac7c-1e3e1f114ecb", 00:22:58.855 "is_configured": true, 00:22:58.855 "data_offset": 0, 00:22:58.855 "data_size": 65536 00:22:58.855 }, 00:22:58.855 { 00:22:58.855 "name": "BaseBdev2", 00:22:58.855 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:22:58.855 "is_configured": true, 00:22:58.855 "data_offset": 0, 00:22:58.855 "data_size": 65536 00:22:58.855 } 00:22:58.855 ] 00:22:58.855 }' 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:58.855 18:26:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:59.114 [2024-07-12 18:26:42.732436] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:59.114 [2024-07-12 18:26:42.732496] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:59.114 [2024-07-12 18:26:42.732532] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:00.048 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:00.048 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:00.048 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:00.048 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:00.048 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:00.048 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:00.048 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.048 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:00.307 "name": "raid_bdev1", 00:23:00.307 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:23:00.307 "strip_size_kb": 0, 00:23:00.307 "state": "online", 00:23:00.307 "raid_level": "raid1", 00:23:00.307 "superblock": false, 00:23:00.307 "num_base_bdevs": 2, 00:23:00.307 
"num_base_bdevs_discovered": 2, 00:23:00.307 "num_base_bdevs_operational": 2, 00:23:00.307 "base_bdevs_list": [ 00:23:00.307 { 00:23:00.307 "name": "spare", 00:23:00.307 "uuid": "85059c6f-9637-585f-ac7c-1e3e1f114ecb", 00:23:00.307 "is_configured": true, 00:23:00.307 "data_offset": 0, 00:23:00.307 "data_size": 65536 00:23:00.307 }, 00:23:00.307 { 00:23:00.307 "name": "BaseBdev2", 00:23:00.307 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:23:00.307 "is_configured": true, 00:23:00.307 "data_offset": 0, 00:23:00.307 "data_size": 65536 00:23:00.307 } 00:23:00.307 ] 00:23:00.307 }' 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.307 18:26:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.566 18:26:44 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:00.566 "name": "raid_bdev1", 00:23:00.566 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:23:00.566 "strip_size_kb": 0, 00:23:00.566 "state": "online", 00:23:00.566 "raid_level": "raid1", 00:23:00.566 "superblock": false, 00:23:00.566 "num_base_bdevs": 2, 00:23:00.566 "num_base_bdevs_discovered": 2, 00:23:00.566 "num_base_bdevs_operational": 2, 00:23:00.566 "base_bdevs_list": [ 00:23:00.566 { 00:23:00.566 "name": "spare", 00:23:00.566 "uuid": "85059c6f-9637-585f-ac7c-1e3e1f114ecb", 00:23:00.566 "is_configured": true, 00:23:00.566 "data_offset": 0, 00:23:00.566 "data_size": 65536 00:23:00.566 }, 00:23:00.566 { 00:23:00.566 "name": "BaseBdev2", 00:23:00.566 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:23:00.566 "is_configured": true, 00:23:00.566 "data_offset": 0, 00:23:00.566 "data_size": 65536 00:23:00.566 } 00:23:00.566 ] 00:23:00.566 }' 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.566 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.825 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:00.825 "name": "raid_bdev1", 00:23:00.825 "uuid": "c7fe9b1e-da76-48e5-9926-8e4d385d7761", 00:23:00.825 "strip_size_kb": 0, 00:23:00.825 "state": "online", 00:23:00.825 "raid_level": "raid1", 00:23:00.825 "superblock": false, 00:23:00.825 "num_base_bdevs": 2, 00:23:00.825 "num_base_bdevs_discovered": 2, 00:23:00.825 "num_base_bdevs_operational": 2, 00:23:00.825 "base_bdevs_list": [ 00:23:00.825 { 00:23:00.825 "name": "spare", 00:23:00.825 "uuid": "85059c6f-9637-585f-ac7c-1e3e1f114ecb", 00:23:00.825 "is_configured": true, 00:23:00.825 "data_offset": 0, 00:23:00.825 "data_size": 65536 00:23:00.825 }, 00:23:00.825 { 00:23:00.825 "name": "BaseBdev2", 00:23:00.825 "uuid": "23fdcb86-a2f5-51cf-a613-088c4b3c1a3f", 00:23:00.825 "is_configured": true, 00:23:00.825 "data_offset": 0, 00:23:00.825 "data_size": 65536 00:23:00.825 } 00:23:00.825 ] 00:23:00.825 }' 00:23:00.825 18:26:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:00.825 18:26:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:01.394 18:26:45 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:01.653 [2024-07-12 18:26:45.299860] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:01.653 [2024-07-12 18:26:45.299887] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:01.653 [2024-07-12 18:26:45.299950] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:01.653 [2024-07-12 18:26:45.300005] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:01.653 [2024-07-12 18:26:45.300017] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14ff070 name raid_bdev1, state offline 00:23:01.653 18:26:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.653 18:26:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:23:01.912 18:26:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:01.912 18:26:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:01.913 18:26:45 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:02.480 /dev/nbd0 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:02.480 1+0 records in 00:23:02.480 1+0 records out 00:23:02.480 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239364 s, 17.1 MB/s 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:02.480 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:02.739 /dev/nbd1 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:02.739 1+0 records in 00:23:02.739 1+0 records out 00:23:02.739 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324654 s, 12.6 MB/s 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:02.739 18:26:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:02.740 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:02.740 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:02.740 18:26:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:02.998 18:26:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:02.998 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:02.998 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:02.998 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:02.998 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:02.998 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:02.998 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:03.262 18:26:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:03.522 18:26:47 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2565026 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2565026 ']' 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2565026 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2565026 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2565026' 00:23:03.522 killing process with pid 2565026 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2565026 00:23:03.522 Received shutdown signal, test time was about 60.000000 seconds 00:23:03.522 00:23:03.522 Latency(us) 00:23:03.522 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:03.522 =================================================================================================================== 00:23:03.522 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:03.522 [2024-07-12 18:26:47.063407] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:03.522 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2565026 00:23:03.522 [2024-07-12 18:26:47.091155] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@784 -- # return 0 00:23:03.782 00:23:03.782 real 0m23.065s 00:23:03.782 user 0m30.228s 00:23:03.782 sys 0m5.625s 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:03.782 ************************************ 00:23:03.782 END TEST raid_rebuild_test 00:23:03.782 ************************************ 00:23:03.782 18:26:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:03.782 18:26:47 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:23:03.782 18:26:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:03.782 18:26:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:03.782 18:26:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:03.782 ************************************ 00:23:03.782 START TEST raid_rebuild_test_sb 00:23:03.782 ************************************ 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:03.782 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2568263 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2568263 /var/tmp/spdk-raid.sock 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2568263 ']' 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:03.783 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:03.783 18:26:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:03.783 [2024-07-12 18:26:47.478064] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:23:03.783 [2024-07-12 18:26:47.478134] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2568263 ] 00:23:03.783 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:03.783 Zero copy mechanism will not be used. 
00:23:04.042 [2024-07-12 18:26:47.610511] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:04.042 [2024-07-12 18:26:47.718133] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:04.300 [2024-07-12 18:26:47.777388] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:04.300 [2024-07-12 18:26:47.777415] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:04.868 18:26:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:04.868 18:26:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:23:04.868 18:26:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:04.868 18:26:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:05.169 BaseBdev1_malloc 00:23:05.169 18:26:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:05.169 [2024-07-12 18:26:48.881622] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:05.169 [2024-07-12 18:26:48.881673] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:05.169 [2024-07-12 18:26:48.881695] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2345d40 00:23:05.169 [2024-07-12 18:26:48.881707] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:05.169 [2024-07-12 18:26:48.883317] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:05.169 [2024-07-12 18:26:48.883348] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:05.169 BaseBdev1 
00:23:05.435 18:26:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:05.435 18:26:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:05.435 BaseBdev2_malloc 00:23:05.435 18:26:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:05.694 [2024-07-12 18:26:49.375778] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:05.694 [2024-07-12 18:26:49.375823] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:05.694 [2024-07-12 18:26:49.375846] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2346860 00:23:05.694 [2024-07-12 18:26:49.375858] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:05.694 [2024-07-12 18:26:49.377235] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:05.694 [2024-07-12 18:26:49.377263] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:05.694 BaseBdev2 00:23:05.694 18:26:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:05.952 spare_malloc 00:23:05.952 18:26:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:06.211 spare_delay 00:23:06.211 18:26:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:06.470 [2024-07-12 18:26:50.114171] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:06.470 [2024-07-12 18:26:50.114228] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:06.470 [2024-07-12 18:26:50.114251] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f4ec0 00:23:06.470 [2024-07-12 18:26:50.114264] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:06.470 [2024-07-12 18:26:50.115874] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:06.470 [2024-07-12 18:26:50.115903] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:06.470 spare 00:23:06.470 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:06.729 [2024-07-12 18:26:50.362851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:06.729 [2024-07-12 18:26:50.364114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:06.729 [2024-07-12 18:26:50.364272] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24f6070 00:23:06.729 [2024-07-12 18:26:50.364285] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:06.729 [2024-07-12 18:26:50.364480] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24ef490 00:23:06.729 [2024-07-12 18:26:50.364619] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24f6070 00:23:06.729 [2024-07-12 18:26:50.364629] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x24f6070 00:23:06.729 [2024-07-12 18:26:50.364724] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.729 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.988 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.988 "name": "raid_bdev1", 00:23:06.988 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:06.988 "strip_size_kb": 0, 00:23:06.988 "state": "online", 00:23:06.988 "raid_level": "raid1", 00:23:06.988 "superblock": true, 00:23:06.988 "num_base_bdevs": 2, 00:23:06.988 "num_base_bdevs_discovered": 2, 00:23:06.988 
"num_base_bdevs_operational": 2, 00:23:06.988 "base_bdevs_list": [ 00:23:06.988 { 00:23:06.988 "name": "BaseBdev1", 00:23:06.988 "uuid": "b07ee27d-40b4-5cec-ae18-0e9465da7dc2", 00:23:06.988 "is_configured": true, 00:23:06.988 "data_offset": 2048, 00:23:06.988 "data_size": 63488 00:23:06.988 }, 00:23:06.988 { 00:23:06.988 "name": "BaseBdev2", 00:23:06.988 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:06.988 "is_configured": true, 00:23:06.988 "data_offset": 2048, 00:23:06.988 "data_size": 63488 00:23:06.988 } 00:23:06.988 ] 00:23:06.988 }' 00:23:06.988 18:26:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.988 18:26:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:07.556 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:07.556 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:07.815 [2024-07-12 18:26:51.445912] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:07.815 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:07.815 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.815 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:08.074 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:08.333 [2024-07-12 18:26:51.943043] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24ef490 00:23:08.333 /dev/nbd0 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:08.334 1+0 records in 00:23:08.334 1+0 records out 00:23:08.334 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236764 s, 17.3 MB/s 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:08.334 18:26:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:23:14.898 63488+0 records in 00:23:14.898 63488+0 records out 00:23:14.898 32505856 bytes (33 MB, 
31 MiB) copied, 6.09312 s, 5.3 MB/s 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:14.898 [2024-07-12 18:26:58.286695] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:23:14.898 [2024-07-12 18:26:58.519362] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.898 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.156 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:15.156 "name": "raid_bdev1", 00:23:15.156 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:15.156 "strip_size_kb": 0, 00:23:15.156 "state": "online", 00:23:15.156 "raid_level": "raid1", 00:23:15.156 "superblock": true, 00:23:15.156 "num_base_bdevs": 2, 00:23:15.156 "num_base_bdevs_discovered": 1, 00:23:15.156 
"num_base_bdevs_operational": 1, 00:23:15.156 "base_bdevs_list": [ 00:23:15.156 { 00:23:15.156 "name": null, 00:23:15.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.156 "is_configured": false, 00:23:15.156 "data_offset": 2048, 00:23:15.156 "data_size": 63488 00:23:15.156 }, 00:23:15.156 { 00:23:15.156 "name": "BaseBdev2", 00:23:15.156 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:15.156 "is_configured": true, 00:23:15.156 "data_offset": 2048, 00:23:15.156 "data_size": 63488 00:23:15.156 } 00:23:15.156 ] 00:23:15.156 }' 00:23:15.156 18:26:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:15.156 18:26:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:15.722 18:26:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:15.981 [2024-07-12 18:26:59.586200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:15.981 [2024-07-12 18:26:59.591143] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24f5ce0 00:23:15.981 [2024-07-12 18:26:59.593348] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:15.981 18:26:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:16.917 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:16.917 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:16.917 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:16.917 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:16.917 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:23:16.917 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.917 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.176 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.176 "name": "raid_bdev1", 00:23:17.176 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:17.176 "strip_size_kb": 0, 00:23:17.176 "state": "online", 00:23:17.176 "raid_level": "raid1", 00:23:17.176 "superblock": true, 00:23:17.176 "num_base_bdevs": 2, 00:23:17.176 "num_base_bdevs_discovered": 2, 00:23:17.176 "num_base_bdevs_operational": 2, 00:23:17.176 "process": { 00:23:17.176 "type": "rebuild", 00:23:17.176 "target": "spare", 00:23:17.176 "progress": { 00:23:17.176 "blocks": 24576, 00:23:17.176 "percent": 38 00:23:17.176 } 00:23:17.176 }, 00:23:17.176 "base_bdevs_list": [ 00:23:17.176 { 00:23:17.176 "name": "spare", 00:23:17.176 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:17.176 "is_configured": true, 00:23:17.176 "data_offset": 2048, 00:23:17.176 "data_size": 63488 00:23:17.176 }, 00:23:17.176 { 00:23:17.176 "name": "BaseBdev2", 00:23:17.176 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:17.176 "is_configured": true, 00:23:17.176 "data_offset": 2048, 00:23:17.176 "data_size": 63488 00:23:17.176 } 00:23:17.176 ] 00:23:17.176 }' 00:23:17.176 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.435 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:17.435 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.435 18:27:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:17.435 18:27:00 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:17.694 [2024-07-12 18:27:01.175888] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:17.694 [2024-07-12 18:27:01.205793] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:17.694 [2024-07-12 18:27:01.205839] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:17.694 [2024-07-12 18:27:01.205855] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:17.694 [2024-07-12 18:27:01.205869] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.694 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.951 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.951 "name": "raid_bdev1", 00:23:17.951 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:17.951 "strip_size_kb": 0, 00:23:17.951 "state": "online", 00:23:17.952 "raid_level": "raid1", 00:23:17.952 "superblock": true, 00:23:17.952 "num_base_bdevs": 2, 00:23:17.952 "num_base_bdevs_discovered": 1, 00:23:17.952 "num_base_bdevs_operational": 1, 00:23:17.952 "base_bdevs_list": [ 00:23:17.952 { 00:23:17.952 "name": null, 00:23:17.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.952 "is_configured": false, 00:23:17.952 "data_offset": 2048, 00:23:17.952 "data_size": 63488 00:23:17.952 }, 00:23:17.952 { 00:23:17.952 "name": "BaseBdev2", 00:23:17.952 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:17.952 "is_configured": true, 00:23:17.952 "data_offset": 2048, 00:23:17.952 "data_size": 63488 00:23:17.952 } 00:23:17.952 ] 00:23:17.952 }' 00:23:17.952 18:27:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.952 18:27:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:18.516 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:18.516 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:18.516 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:18.516 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:18.516 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:18.517 18:27:02 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.517 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.774 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:18.774 "name": "raid_bdev1", 00:23:18.774 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:18.774 "strip_size_kb": 0, 00:23:18.774 "state": "online", 00:23:18.774 "raid_level": "raid1", 00:23:18.774 "superblock": true, 00:23:18.774 "num_base_bdevs": 2, 00:23:18.774 "num_base_bdevs_discovered": 1, 00:23:18.774 "num_base_bdevs_operational": 1, 00:23:18.774 "base_bdevs_list": [ 00:23:18.774 { 00:23:18.774 "name": null, 00:23:18.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.774 "is_configured": false, 00:23:18.774 "data_offset": 2048, 00:23:18.774 "data_size": 63488 00:23:18.774 }, 00:23:18.774 { 00:23:18.774 "name": "BaseBdev2", 00:23:18.774 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:18.774 "is_configured": true, 00:23:18.774 "data_offset": 2048, 00:23:18.774 "data_size": 63488 00:23:18.774 } 00:23:18.774 ] 00:23:18.774 }' 00:23:18.774 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:18.774 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:18.774 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:18.774 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:18.774 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:19.031 [2024-07-12 18:27:02.650735] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:19.031 [2024-07-12 18:27:02.656413] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24f5ce0 00:23:19.031 [2024-07-12 18:27:02.657924] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:19.031 18:27:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:19.964 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:19.964 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:19.964 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:19.964 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:19.964 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:19.964 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.964 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.223 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.223 "name": "raid_bdev1", 00:23:20.223 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:20.223 "strip_size_kb": 0, 00:23:20.223 "state": "online", 00:23:20.223 "raid_level": "raid1", 00:23:20.223 "superblock": true, 00:23:20.223 "num_base_bdevs": 2, 00:23:20.223 "num_base_bdevs_discovered": 2, 00:23:20.223 "num_base_bdevs_operational": 2, 00:23:20.223 "process": { 00:23:20.223 "type": "rebuild", 00:23:20.223 "target": "spare", 00:23:20.223 "progress": { 00:23:20.223 "blocks": 24576, 00:23:20.223 "percent": 38 00:23:20.223 } 00:23:20.223 }, 00:23:20.223 
"base_bdevs_list": [ 00:23:20.223 { 00:23:20.223 "name": "spare", 00:23:20.223 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:20.223 "is_configured": true, 00:23:20.223 "data_offset": 2048, 00:23:20.223 "data_size": 63488 00:23:20.223 }, 00:23:20.223 { 00:23:20.223 "name": "BaseBdev2", 00:23:20.223 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:20.223 "is_configured": true, 00:23:20.223 "data_offset": 2048, 00:23:20.223 "data_size": 63488 00:23:20.223 } 00:23:20.223 ] 00:23:20.223 }' 00:23:20.223 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:20.481 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:20.481 18:27:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.481 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:20.481 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:20.481 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:20.481 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:20.481 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:20.481 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:20.481 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:20.482 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=797 00:23:20.482 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:20.482 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:20.482 18:27:04 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:20.482 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:20.482 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:20.482 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:20.482 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.482 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.740 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.740 "name": "raid_bdev1", 00:23:20.740 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:20.740 "strip_size_kb": 0, 00:23:20.740 "state": "online", 00:23:20.740 "raid_level": "raid1", 00:23:20.740 "superblock": true, 00:23:20.740 "num_base_bdevs": 2, 00:23:20.740 "num_base_bdevs_discovered": 2, 00:23:20.740 "num_base_bdevs_operational": 2, 00:23:20.740 "process": { 00:23:20.740 "type": "rebuild", 00:23:20.740 "target": "spare", 00:23:20.740 "progress": { 00:23:20.740 "blocks": 30720, 00:23:20.740 "percent": 48 00:23:20.740 } 00:23:20.740 }, 00:23:20.740 "base_bdevs_list": [ 00:23:20.740 { 00:23:20.740 "name": "spare", 00:23:20.740 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:20.740 "is_configured": true, 00:23:20.740 "data_offset": 2048, 00:23:20.740 "data_size": 63488 00:23:20.740 }, 00:23:20.740 { 00:23:20.740 "name": "BaseBdev2", 00:23:20.740 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:20.740 "is_configured": true, 00:23:20.740 "data_offset": 2048, 00:23:20.740 "data_size": 63488 00:23:20.740 } 00:23:20.740 ] 00:23:20.741 }' 00:23:20.741 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:23:20.741 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:20.741 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.741 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:20.741 18:27:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:21.676 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:21.676 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:21.676 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:21.676 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:21.676 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:21.676 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:21.676 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.676 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.935 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:21.935 "name": "raid_bdev1", 00:23:21.935 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:21.935 "strip_size_kb": 0, 00:23:21.935 "state": "online", 00:23:21.935 "raid_level": "raid1", 00:23:21.935 "superblock": true, 00:23:21.935 "num_base_bdevs": 2, 00:23:21.935 "num_base_bdevs_discovered": 2, 00:23:21.935 "num_base_bdevs_operational": 2, 00:23:21.935 "process": { 00:23:21.935 "type": "rebuild", 00:23:21.935 "target": "spare", 
00:23:21.935 "progress": { 00:23:21.935 "blocks": 57344, 00:23:21.935 "percent": 90 00:23:21.935 } 00:23:21.935 }, 00:23:21.935 "base_bdevs_list": [ 00:23:21.935 { 00:23:21.935 "name": "spare", 00:23:21.935 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:21.935 "is_configured": true, 00:23:21.935 "data_offset": 2048, 00:23:21.935 "data_size": 63488 00:23:21.935 }, 00:23:21.935 { 00:23:21.935 "name": "BaseBdev2", 00:23:21.935 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:21.935 "is_configured": true, 00:23:21.935 "data_offset": 2048, 00:23:21.935 "data_size": 63488 00:23:21.935 } 00:23:21.935 ] 00:23:21.935 }' 00:23:21.935 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:21.935 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:21.935 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:21.935 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:21.935 18:27:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:22.193 [2024-07-12 18:27:05.782031] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:22.193 [2024-07-12 18:27:05.782091] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:22.193 [2024-07-12 18:27:05.782181] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:23.128 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:23.128 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:23.128 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:23.128 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:23:23.128 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:23.128 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:23.128 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.128 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:23.386 "name": "raid_bdev1", 00:23:23.386 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:23.386 "strip_size_kb": 0, 00:23:23.386 "state": "online", 00:23:23.386 "raid_level": "raid1", 00:23:23.386 "superblock": true, 00:23:23.386 "num_base_bdevs": 2, 00:23:23.386 "num_base_bdevs_discovered": 2, 00:23:23.386 "num_base_bdevs_operational": 2, 00:23:23.386 "base_bdevs_list": [ 00:23:23.386 { 00:23:23.386 "name": "spare", 00:23:23.386 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:23.386 "is_configured": true, 00:23:23.386 "data_offset": 2048, 00:23:23.386 "data_size": 63488 00:23:23.386 }, 00:23:23.386 { 00:23:23.386 "name": "BaseBdev2", 00:23:23.386 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:23.386 "is_configured": true, 00:23:23.386 "data_offset": 2048, 00:23:23.386 "data_size": 63488 00:23:23.386 } 00:23:23.386 ] 00:23:23.386 }' 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:23.386 
18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:23.386 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.387 18:27:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:23.645 "name": "raid_bdev1", 00:23:23.645 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:23.645 "strip_size_kb": 0, 00:23:23.645 "state": "online", 00:23:23.645 "raid_level": "raid1", 00:23:23.645 "superblock": true, 00:23:23.645 "num_base_bdevs": 2, 00:23:23.645 "num_base_bdevs_discovered": 2, 00:23:23.645 "num_base_bdevs_operational": 2, 00:23:23.645 "base_bdevs_list": [ 00:23:23.645 { 00:23:23.645 "name": "spare", 00:23:23.645 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:23.645 "is_configured": true, 00:23:23.645 "data_offset": 2048, 00:23:23.645 "data_size": 63488 00:23:23.645 }, 00:23:23.645 { 00:23:23.645 "name": "BaseBdev2", 00:23:23.645 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:23.645 "is_configured": true, 00:23:23.645 "data_offset": 2048, 00:23:23.645 "data_size": 63488 00:23:23.645 } 00:23:23.645 ] 00:23:23.645 }' 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.645 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.905 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:23.905 "name": "raid_bdev1", 00:23:23.905 "uuid": 
"248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:23.905 "strip_size_kb": 0, 00:23:23.905 "state": "online", 00:23:23.905 "raid_level": "raid1", 00:23:23.905 "superblock": true, 00:23:23.905 "num_base_bdevs": 2, 00:23:23.905 "num_base_bdevs_discovered": 2, 00:23:23.905 "num_base_bdevs_operational": 2, 00:23:23.905 "base_bdevs_list": [ 00:23:23.905 { 00:23:23.905 "name": "spare", 00:23:23.905 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:23.905 "is_configured": true, 00:23:23.905 "data_offset": 2048, 00:23:23.905 "data_size": 63488 00:23:23.905 }, 00:23:23.905 { 00:23:23.905 "name": "BaseBdev2", 00:23:23.905 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:23.905 "is_configured": true, 00:23:23.905 "data_offset": 2048, 00:23:23.905 "data_size": 63488 00:23:23.905 } 00:23:23.905 ] 00:23:23.905 }' 00:23:23.905 18:27:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:23.905 18:27:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:24.472 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:24.730 [2024-07-12 18:27:08.345796] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:24.730 [2024-07-12 18:27:08.345825] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:24.730 [2024-07-12 18:27:08.345885] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:24.730 [2024-07-12 18:27:08.345948] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:24.730 [2024-07-12 18:27:08.345961] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24f6070 name raid_bdev1, state offline 00:23:24.730 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.730 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:24.989 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:25.247 /dev/nbd0 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:25.247 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:25.248 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:25.248 1+0 records in 00:23:25.248 1+0 records out 00:23:25.248 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226937 s, 18.0 MB/s 00:23:25.248 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.248 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:25.248 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.248 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:25.248 18:27:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:25.248 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:25.248 18:27:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:25.248 18:27:08 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:25.506 /dev/nbd1 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:25.506 1+0 records in 00:23:25.506 1+0 records out 00:23:25.506 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239072 s, 17.1 MB/s 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:25.506 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:25.765 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:25.765 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:25.765 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:25.765 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:25.765 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:25.765 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:25.765 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:26.024 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:26.024 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:26.024 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:26.024 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:26.024 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:26.024 18:27:09 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:26.024 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:26.024 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:26.024 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:26.024 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:26.283 18:27:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:26.541 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:26.799 [2024-07-12 18:27:10.276651] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:23:26.799 [2024-07-12 18:27:10.276708] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.799 [2024-07-12 18:27:10.276731] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f5500 00:23:26.799 [2024-07-12 18:27:10.276743] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.799 [2024-07-12 18:27:10.278395] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.799 [2024-07-12 18:27:10.278424] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:26.799 [2024-07-12 18:27:10.278516] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:26.799 [2024-07-12 18:27:10.278545] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:26.799 [2024-07-12 18:27:10.278644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:26.799 spare 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.799 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.799 [2024-07-12 18:27:10.378957] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24f4260 00:23:26.799 [2024-07-12 18:27:10.378971] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:26.799 [2024-07-12 18:27:10.379163] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24ef490 00:23:26.799 [2024-07-12 18:27:10.379308] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24f4260 00:23:26.799 [2024-07-12 18:27:10.379318] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x24f4260 00:23:26.799 [2024-07-12 18:27:10.379421] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:27.057 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.057 "name": "raid_bdev1", 00:23:27.057 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:27.057 "strip_size_kb": 0, 00:23:27.057 "state": "online", 00:23:27.057 "raid_level": "raid1", 00:23:27.057 "superblock": true, 00:23:27.057 "num_base_bdevs": 2, 00:23:27.057 "num_base_bdevs_discovered": 2, 00:23:27.057 "num_base_bdevs_operational": 2, 00:23:27.057 "base_bdevs_list": [ 00:23:27.057 { 00:23:27.057 "name": "spare", 00:23:27.057 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:27.057 "is_configured": true, 00:23:27.057 "data_offset": 2048, 00:23:27.057 "data_size": 63488 00:23:27.057 }, 00:23:27.057 { 00:23:27.057 "name": "BaseBdev2", 00:23:27.057 "uuid": 
"0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:27.057 "is_configured": true, 00:23:27.057 "data_offset": 2048, 00:23:27.057 "data_size": 63488 00:23:27.057 } 00:23:27.057 ] 00:23:27.057 }' 00:23:27.057 18:27:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.057 18:27:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:27.621 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:27.621 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:27.621 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:27.621 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:27.621 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:27.621 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.621 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.878 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:27.878 "name": "raid_bdev1", 00:23:27.878 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:27.878 "strip_size_kb": 0, 00:23:27.878 "state": "online", 00:23:27.878 "raid_level": "raid1", 00:23:27.878 "superblock": true, 00:23:27.878 "num_base_bdevs": 2, 00:23:27.878 "num_base_bdevs_discovered": 2, 00:23:27.878 "num_base_bdevs_operational": 2, 00:23:27.878 "base_bdevs_list": [ 00:23:27.878 { 00:23:27.878 "name": "spare", 00:23:27.878 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:27.878 "is_configured": true, 00:23:27.878 "data_offset": 2048, 00:23:27.878 "data_size": 63488 00:23:27.878 }, 00:23:27.878 { 
00:23:27.878 "name": "BaseBdev2", 00:23:27.878 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:27.878 "is_configured": true, 00:23:27.878 "data_offset": 2048, 00:23:27.878 "data_size": 63488 00:23:27.878 } 00:23:27.878 ] 00:23:27.878 }' 00:23:27.878 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:27.878 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:27.878 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:27.878 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:27.878 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.878 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:28.187 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:28.187 18:27:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:28.776 [2024-07-12 18:27:12.213933] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:28.776 "name": "raid_bdev1", 00:23:28.776 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:28.776 "strip_size_kb": 0, 00:23:28.776 "state": "online", 00:23:28.776 "raid_level": "raid1", 00:23:28.776 "superblock": true, 00:23:28.776 "num_base_bdevs": 2, 00:23:28.776 "num_base_bdevs_discovered": 1, 00:23:28.776 "num_base_bdevs_operational": 1, 00:23:28.776 "base_bdevs_list": [ 00:23:28.776 { 00:23:28.776 "name": null, 00:23:28.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:28.776 "is_configured": false, 00:23:28.776 "data_offset": 2048, 00:23:28.776 "data_size": 63488 00:23:28.776 }, 00:23:28.776 { 00:23:28.776 "name": "BaseBdev2", 00:23:28.776 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:28.776 "is_configured": true, 00:23:28.776 "data_offset": 2048, 00:23:28.776 "data_size": 63488 00:23:28.776 } 00:23:28.776 ] 00:23:28.776 }' 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:28.776 18:27:12 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:23:29.704 18:27:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:29.704 [2024-07-12 18:27:13.308848] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:29.705 [2024-07-12 18:27:13.309023] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:29.705 [2024-07-12 18:27:13.309041] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:29.705 [2024-07-12 18:27:13.309071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:29.705 [2024-07-12 18:27:13.314624] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24ef490 00:23:29.705 [2024-07-12 18:27:13.317017] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:29.705 18:27:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:30.633 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:30.633 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:30.633 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:30.633 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:30.633 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:30.633 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.633 18:27:14 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.891 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:30.891 "name": "raid_bdev1", 00:23:30.891 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:30.891 "strip_size_kb": 0, 00:23:30.891 "state": "online", 00:23:30.891 "raid_level": "raid1", 00:23:30.891 "superblock": true, 00:23:30.891 "num_base_bdevs": 2, 00:23:30.891 "num_base_bdevs_discovered": 2, 00:23:30.891 "num_base_bdevs_operational": 2, 00:23:30.891 "process": { 00:23:30.891 "type": "rebuild", 00:23:30.891 "target": "spare", 00:23:30.891 "progress": { 00:23:30.891 "blocks": 24576, 00:23:30.891 "percent": 38 00:23:30.891 } 00:23:30.891 }, 00:23:30.891 "base_bdevs_list": [ 00:23:30.891 { 00:23:30.891 "name": "spare", 00:23:30.891 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:30.891 "is_configured": true, 00:23:30.891 "data_offset": 2048, 00:23:30.891 "data_size": 63488 00:23:30.891 }, 00:23:30.891 { 00:23:30.891 "name": "BaseBdev2", 00:23:30.891 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:30.891 "is_configured": true, 00:23:30.891 "data_offset": 2048, 00:23:30.891 "data_size": 63488 00:23:30.891 } 00:23:30.891 ] 00:23:30.891 }' 00:23:30.891 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:31.149 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:31.149 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:31.149 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:31.149 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:31.406 [2024-07-12 18:27:14.923512] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:23:31.406 [2024-07-12 18:27:14.929782] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:31.406 [2024-07-12 18:27:14.929825] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:31.406 [2024-07-12 18:27:14.929840] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:31.406 [2024-07-12 18:27:14.929849] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.406 18:27:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.663 18:27:15 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:31.663 "name": "raid_bdev1", 00:23:31.663 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:31.663 "strip_size_kb": 0, 00:23:31.663 "state": "online", 00:23:31.663 "raid_level": "raid1", 00:23:31.663 "superblock": true, 00:23:31.663 "num_base_bdevs": 2, 00:23:31.663 "num_base_bdevs_discovered": 1, 00:23:31.663 "num_base_bdevs_operational": 1, 00:23:31.663 "base_bdevs_list": [ 00:23:31.663 { 00:23:31.663 "name": null, 00:23:31.663 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:31.663 "is_configured": false, 00:23:31.663 "data_offset": 2048, 00:23:31.663 "data_size": 63488 00:23:31.663 }, 00:23:31.663 { 00:23:31.663 "name": "BaseBdev2", 00:23:31.663 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:31.663 "is_configured": true, 00:23:31.663 "data_offset": 2048, 00:23:31.663 "data_size": 63488 00:23:31.663 } 00:23:31.663 ] 00:23:31.663 }' 00:23:31.663 18:27:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:31.663 18:27:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:32.228 18:27:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:32.485 [2024-07-12 18:27:16.009302] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:32.485 [2024-07-12 18:27:16.009359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:32.485 [2024-07-12 18:27:16.009382] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f5730 00:23:32.485 [2024-07-12 18:27:16.009395] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:32.485 [2024-07-12 18:27:16.009774] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:32.485 [2024-07-12 
18:27:16.009793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:32.485 [2024-07-12 18:27:16.009875] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:32.485 [2024-07-12 18:27:16.009888] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:32.485 [2024-07-12 18:27:16.009899] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:32.485 [2024-07-12 18:27:16.009918] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:32.485 [2024-07-12 18:27:16.014821] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24f6aa0 00:23:32.485 spare 00:23:32.485 [2024-07-12 18:27:16.016288] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:32.485 18:27:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:33.416 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:33.416 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:33.416 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:33.416 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:33.416 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:33.416 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.416 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.674 18:27:17 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.674 "name": "raid_bdev1", 00:23:33.674 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:33.674 "strip_size_kb": 0, 00:23:33.674 "state": "online", 00:23:33.674 "raid_level": "raid1", 00:23:33.674 "superblock": true, 00:23:33.674 "num_base_bdevs": 2, 00:23:33.674 "num_base_bdevs_discovered": 2, 00:23:33.674 "num_base_bdevs_operational": 2, 00:23:33.674 "process": { 00:23:33.674 "type": "rebuild", 00:23:33.674 "target": "spare", 00:23:33.674 "progress": { 00:23:33.674 "blocks": 24576, 00:23:33.674 "percent": 38 00:23:33.674 } 00:23:33.674 }, 00:23:33.674 "base_bdevs_list": [ 00:23:33.674 { 00:23:33.674 "name": "spare", 00:23:33.674 "uuid": "5a376096-69e2-5cd0-976d-3bea0df81fef", 00:23:33.674 "is_configured": true, 00:23:33.674 "data_offset": 2048, 00:23:33.674 "data_size": 63488 00:23:33.674 }, 00:23:33.674 { 00:23:33.674 "name": "BaseBdev2", 00:23:33.674 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:33.674 "is_configured": true, 00:23:33.674 "data_offset": 2048, 00:23:33.674 "data_size": 63488 00:23:33.674 } 00:23:33.674 ] 00:23:33.674 }' 00:23:33.674 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:33.674 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:33.674 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.674 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:33.674 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:33.932 [2024-07-12 18:27:17.607595] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:33.933 [2024-07-12 18:27:17.629039] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild 
on raid bdev raid_bdev1: No such device 00:23:33.933 [2024-07-12 18:27:17.629084] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:33.933 [2024-07-12 18:27:17.629100] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:33.933 [2024-07-12 18:27:17.629108] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:33.933 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:33.933 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:33.933 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:33.933 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.190 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:34.190 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:34.190 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.190 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.190 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.190 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.190 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.190 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.448 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.448 "name": "raid_bdev1", 00:23:34.448 "uuid": 
"248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:34.448 "strip_size_kb": 0, 00:23:34.448 "state": "online", 00:23:34.448 "raid_level": "raid1", 00:23:34.448 "superblock": true, 00:23:34.448 "num_base_bdevs": 2, 00:23:34.448 "num_base_bdevs_discovered": 1, 00:23:34.448 "num_base_bdevs_operational": 1, 00:23:34.448 "base_bdevs_list": [ 00:23:34.448 { 00:23:34.448 "name": null, 00:23:34.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.448 "is_configured": false, 00:23:34.448 "data_offset": 2048, 00:23:34.448 "data_size": 63488 00:23:34.448 }, 00:23:34.448 { 00:23:34.448 "name": "BaseBdev2", 00:23:34.448 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:34.448 "is_configured": true, 00:23:34.448 "data_offset": 2048, 00:23:34.448 "data_size": 63488 00:23:34.448 } 00:23:34.448 ] 00:23:34.448 }' 00:23:34.448 18:27:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.448 18:27:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:35.013 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:35.013 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.013 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:35.013 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:35.013 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.013 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.013 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.270 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:23:35.270 "name": "raid_bdev1", 00:23:35.270 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:35.270 "strip_size_kb": 0, 00:23:35.270 "state": "online", 00:23:35.270 "raid_level": "raid1", 00:23:35.270 "superblock": true, 00:23:35.270 "num_base_bdevs": 2, 00:23:35.270 "num_base_bdevs_discovered": 1, 00:23:35.270 "num_base_bdevs_operational": 1, 00:23:35.270 "base_bdevs_list": [ 00:23:35.270 { 00:23:35.270 "name": null, 00:23:35.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.270 "is_configured": false, 00:23:35.270 "data_offset": 2048, 00:23:35.270 "data_size": 63488 00:23:35.270 }, 00:23:35.270 { 00:23:35.270 "name": "BaseBdev2", 00:23:35.270 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:35.270 "is_configured": true, 00:23:35.270 "data_offset": 2048, 00:23:35.270 "data_size": 63488 00:23:35.270 } 00:23:35.270 ] 00:23:35.270 }' 00:23:35.270 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.270 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:35.270 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.270 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:35.270 18:27:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:35.528 18:27:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:35.784 [2024-07-12 18:27:19.310659] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:35.784 [2024-07-12 18:27:19.310715] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:35.784 
[2024-07-12 18:27:19.310739] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24f0650 00:23:35.784 [2024-07-12 18:27:19.310753] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:35.784 [2024-07-12 18:27:19.311129] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:35.784 [2024-07-12 18:27:19.311149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:35.784 [2024-07-12 18:27:19.311217] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:35.784 [2024-07-12 18:27:19.311231] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:35.784 [2024-07-12 18:27:19.311242] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:35.784 BaseBdev1 00:23:35.784 18:27:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.714 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.971 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.971 "name": "raid_bdev1", 00:23:36.971 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:36.971 "strip_size_kb": 0, 00:23:36.971 "state": "online", 00:23:36.971 "raid_level": "raid1", 00:23:36.971 "superblock": true, 00:23:36.971 "num_base_bdevs": 2, 00:23:36.971 "num_base_bdevs_discovered": 1, 00:23:36.971 "num_base_bdevs_operational": 1, 00:23:36.971 "base_bdevs_list": [ 00:23:36.971 { 00:23:36.971 "name": null, 00:23:36.971 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.971 "is_configured": false, 00:23:36.971 "data_offset": 2048, 00:23:36.971 "data_size": 63488 00:23:36.971 }, 00:23:36.971 { 00:23:36.971 "name": "BaseBdev2", 00:23:36.971 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:36.971 "is_configured": true, 00:23:36.971 "data_offset": 2048, 00:23:36.971 "data_size": 63488 00:23:36.971 } 00:23:36.971 ] 00:23:36.971 }' 00:23:36.971 18:27:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.971 18:27:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:37.534 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:37.534 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:37.534 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:23:37.534 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:37.534 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:37.534 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.534 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:37.791 "name": "raid_bdev1", 00:23:37.791 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:37.791 "strip_size_kb": 0, 00:23:37.791 "state": "online", 00:23:37.791 "raid_level": "raid1", 00:23:37.791 "superblock": true, 00:23:37.791 "num_base_bdevs": 2, 00:23:37.791 "num_base_bdevs_discovered": 1, 00:23:37.791 "num_base_bdevs_operational": 1, 00:23:37.791 "base_bdevs_list": [ 00:23:37.791 { 00:23:37.791 "name": null, 00:23:37.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.791 "is_configured": false, 00:23:37.791 "data_offset": 2048, 00:23:37.791 "data_size": 63488 00:23:37.791 }, 00:23:37.791 { 00:23:37.791 "name": "BaseBdev2", 00:23:37.791 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:37.791 "is_configured": true, 00:23:37.791 "data_offset": 2048, 00:23:37.791 "data_size": 63488 00:23:37.791 } 00:23:37.791 ] 00:23:37.791 }' 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:37.791 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:38.048 [2024-07-12 18:27:21.729077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:23:38.048 [2024-07-12 18:27:21.729207] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:38.048 [2024-07-12 18:27:21.729223] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:38.048 request: 00:23:38.048 { 00:23:38.048 "base_bdev": "BaseBdev1", 00:23:38.048 "raid_bdev": "raid_bdev1", 00:23:38.048 "method": "bdev_raid_add_base_bdev", 00:23:38.048 "req_id": 1 00:23:38.048 } 00:23:38.048 Got JSON-RPC error response 00:23:38.048 response: 00:23:38.048 { 00:23:38.048 "code": -22, 00:23:38.048 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:38.048 } 00:23:38.048 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:23:38.048 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:38.048 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:38.048 18:27:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:38.048 18:27:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.417 18:27:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.417 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.417 "name": "raid_bdev1", 00:23:39.417 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:39.417 "strip_size_kb": 0, 00:23:39.417 "state": "online", 00:23:39.417 "raid_level": "raid1", 00:23:39.417 "superblock": true, 00:23:39.417 "num_base_bdevs": 2, 00:23:39.417 "num_base_bdevs_discovered": 1, 00:23:39.417 "num_base_bdevs_operational": 1, 00:23:39.417 "base_bdevs_list": [ 00:23:39.417 { 00:23:39.417 "name": null, 00:23:39.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:39.417 "is_configured": false, 00:23:39.417 "data_offset": 2048, 00:23:39.417 "data_size": 63488 00:23:39.417 }, 00:23:39.417 { 00:23:39.417 "name": "BaseBdev2", 00:23:39.417 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:39.417 "is_configured": true, 00:23:39.417 "data_offset": 2048, 00:23:39.417 "data_size": 63488 00:23:39.417 } 00:23:39.417 ] 00:23:39.417 }' 00:23:39.417 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.417 18:27:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:40.003 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:40.003 
18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:40.003 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:40.003 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:40.003 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:40.003 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.003 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:40.259 "name": "raid_bdev1", 00:23:40.259 "uuid": "248d0cf4-af96-433f-ba03-fa32341f0f8d", 00:23:40.259 "strip_size_kb": 0, 00:23:40.259 "state": "online", 00:23:40.259 "raid_level": "raid1", 00:23:40.259 "superblock": true, 00:23:40.259 "num_base_bdevs": 2, 00:23:40.259 "num_base_bdevs_discovered": 1, 00:23:40.259 "num_base_bdevs_operational": 1, 00:23:40.259 "base_bdevs_list": [ 00:23:40.259 { 00:23:40.259 "name": null, 00:23:40.259 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:40.259 "is_configured": false, 00:23:40.259 "data_offset": 2048, 00:23:40.259 "data_size": 63488 00:23:40.259 }, 00:23:40.259 { 00:23:40.259 "name": "BaseBdev2", 00:23:40.259 "uuid": "0ac3eda9-ba99-5f73-bdd7-c8357e5d7640", 00:23:40.259 "is_configured": true, 00:23:40.259 "data_offset": 2048, 00:23:40.259 "data_size": 63488 00:23:40.259 } 00:23:40.259 ] 00:23:40.259 }' 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2568263 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2568263 ']' 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2568263 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:40.259 18:27:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2568263 00:23:40.516 18:27:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:40.516 18:27:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:40.516 18:27:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2568263' 00:23:40.516 killing process with pid 2568263 00:23:40.516 18:27:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2568263 00:23:40.516 Received shutdown signal, test time was about 60.000000 seconds 00:23:40.516 00:23:40.516 Latency(us) 00:23:40.516 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:40.516 =================================================================================================================== 00:23:40.516 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:40.516 [2024-07-12 18:27:24.009470] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:40.516 [2024-07-12 18:27:24.009572] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:40.516 18:27:24 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2568263 00:23:40.516 [2024-07-12 18:27:24.009613] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:40.516 [2024-07-12 18:27:24.009629] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24f4260 name raid_bdev1, state offline 00:23:40.516 [2024-07-12 18:27:24.041010] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:23:40.774 00:23:40.774 real 0m36.861s 00:23:40.774 user 0m52.368s 00:23:40.774 sys 0m7.635s 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:40.774 ************************************ 00:23:40.774 END TEST raid_rebuild_test_sb 00:23:40.774 ************************************ 00:23:40.774 18:27:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:40.774 18:27:24 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:23:40.774 18:27:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:40.774 18:27:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:40.774 18:27:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:40.774 ************************************ 00:23:40.774 START TEST raid_rebuild_test_io 00:23:40.774 ************************************ 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:40.774 18:27:24 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2573969 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2573969 /var/tmp/spdk-raid.sock 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2573969 ']' 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:40.774 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:40.774 18:27:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:40.774 [2024-07-12 18:27:24.424704] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:23:40.774 [2024-07-12 18:27:24.424773] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2573969 ] 00:23:40.774 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:23:40.774 Zero copy mechanism will not be used. 00:23:41.032 [2024-07-12 18:27:24.556150] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:41.032 [2024-07-12 18:27:24.662873] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:41.032 [2024-07-12 18:27:24.731426] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:41.032 [2024-07-12 18:27:24.731460] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:41.962 18:27:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:41.962 18:27:25 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:23:41.962 18:27:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:41.962 18:27:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:41.962 BaseBdev1_malloc 00:23:41.962 18:27:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:42.219 [2024-07-12 18:27:25.817743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:42.219 [2024-07-12 18:27:25.817794] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:42.219 [2024-07-12 18:27:25.817819] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc66d40 00:23:42.219 [2024-07-12 18:27:25.817832] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:42.219 [2024-07-12 18:27:25.819607] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:42.219 [2024-07-12 18:27:25.819636] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:23:42.219 BaseBdev1 00:23:42.219 18:27:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:42.219 18:27:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:42.476 BaseBdev2_malloc 00:23:42.476 18:27:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:42.732 [2024-07-12 18:27:26.297136] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:42.732 [2024-07-12 18:27:26.297186] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:42.732 [2024-07-12 18:27:26.297218] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc67860 00:23:42.732 [2024-07-12 18:27:26.297231] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:42.732 [2024-07-12 18:27:26.298780] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:42.732 [2024-07-12 18:27:26.298808] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:42.732 BaseBdev2 00:23:42.732 18:27:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:42.989 spare_malloc 00:23:42.989 18:27:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:43.246 spare_delay 00:23:43.246 18:27:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:43.503 [2024-07-12 18:27:27.024937] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:43.503 [2024-07-12 18:27:27.024984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:43.503 [2024-07-12 18:27:27.025005] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe15ec0 00:23:43.503 [2024-07-12 18:27:27.025018] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:43.503 [2024-07-12 18:27:27.026588] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:43.503 [2024-07-12 18:27:27.026618] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:43.503 spare 00:23:43.503 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:43.760 [2024-07-12 18:27:27.269588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:43.760 [2024-07-12 18:27:27.270899] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:43.760 [2024-07-12 18:27:27.270983] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe17070 00:23:43.760 [2024-07-12 18:27:27.270995] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:43.760 [2024-07-12 18:27:27.271206] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe10490 00:23:43.760 [2024-07-12 18:27:27.271347] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe17070 00:23:43.760 [2024-07-12 18:27:27.271357] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0xe17070 00:23:43.760 [2024-07-12 18:27:27.271471] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.760 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:44.017 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:44.017 "name": "raid_bdev1", 00:23:44.017 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:44.017 "strip_size_kb": 0, 00:23:44.017 "state": "online", 00:23:44.017 "raid_level": "raid1", 00:23:44.017 "superblock": false, 00:23:44.017 "num_base_bdevs": 2, 00:23:44.017 "num_base_bdevs_discovered": 2, 00:23:44.017 "num_base_bdevs_operational": 
2, 00:23:44.017 "base_bdevs_list": [ 00:23:44.017 { 00:23:44.017 "name": "BaseBdev1", 00:23:44.017 "uuid": "9bd59227-3381-56b1-b049-c9d44daf88e5", 00:23:44.017 "is_configured": true, 00:23:44.017 "data_offset": 0, 00:23:44.017 "data_size": 65536 00:23:44.017 }, 00:23:44.017 { 00:23:44.017 "name": "BaseBdev2", 00:23:44.017 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:44.017 "is_configured": true, 00:23:44.017 "data_offset": 0, 00:23:44.017 "data_size": 65536 00:23:44.017 } 00:23:44.017 ] 00:23:44.017 }' 00:23:44.017 18:27:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:44.017 18:27:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:44.581 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:44.581 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:44.838 [2024-07-12 18:27:28.344680] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:44.838 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:44.838 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.838 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:45.096 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:45.096 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:45.096 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:45.096 18:27:28 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:45.096 [2024-07-12 18:27:28.719490] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe11bd0 00:23:45.096 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:45.096 Zero copy mechanism will not be used. 00:23:45.096 Running I/O for 60 seconds... 00:23:45.353 [2024-07-12 18:27:28.828679] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:45.353 [2024-07-12 18:27:28.844875] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe11bd0 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:45.353 18:27:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.612 18:27:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:45.612 "name": "raid_bdev1", 00:23:45.612 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:45.612 "strip_size_kb": 0, 00:23:45.612 "state": "online", 00:23:45.612 "raid_level": "raid1", 00:23:45.612 "superblock": false, 00:23:45.612 "num_base_bdevs": 2, 00:23:45.612 "num_base_bdevs_discovered": 1, 00:23:45.612 "num_base_bdevs_operational": 1, 00:23:45.612 "base_bdevs_list": [ 00:23:45.612 { 00:23:45.612 "name": null, 00:23:45.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:45.612 "is_configured": false, 00:23:45.612 "data_offset": 0, 00:23:45.612 "data_size": 65536 00:23:45.612 }, 00:23:45.612 { 00:23:45.612 "name": "BaseBdev2", 00:23:45.612 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:45.612 "is_configured": true, 00:23:45.612 "data_offset": 0, 00:23:45.612 "data_size": 65536 00:23:45.612 } 00:23:45.612 ] 00:23:45.612 }' 00:23:45.612 18:27:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:45.612 18:27:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:46.177 18:27:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:46.436 [2024-07-12 18:27:29.965222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:46.436 18:27:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:46.436 [2024-07-12 18:27:30.033459] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd998b0 00:23:46.436 [2024-07-12 18:27:30.036301] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 
00:23:46.436 [2024-07-12 18:27:30.139045] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:46.436 [2024-07-12 18:27:30.139411] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:46.693 [2024-07-12 18:27:30.381459] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:46.693 [2024-07-12 18:27:30.381650] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:47.257 [2024-07-12 18:27:30.745382] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:47.257 [2024-07-12 18:27:30.964813] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:47.257 [2024-07-12 18:27:30.964964] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:47.513 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:47.513 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:47.513 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:47.513 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:47.513 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:47.513 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.513 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:23:47.514 [2024-07-12 18:27:31.213417] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:47.771 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:47.771 "name": "raid_bdev1", 00:23:47.771 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:47.771 "strip_size_kb": 0, 00:23:47.771 "state": "online", 00:23:47.771 "raid_level": "raid1", 00:23:47.771 "superblock": false, 00:23:47.771 "num_base_bdevs": 2, 00:23:47.771 "num_base_bdevs_discovered": 2, 00:23:47.771 "num_base_bdevs_operational": 2, 00:23:47.771 "process": { 00:23:47.771 "type": "rebuild", 00:23:47.771 "target": "spare", 00:23:47.771 "progress": { 00:23:47.771 "blocks": 14336, 00:23:47.771 "percent": 21 00:23:47.771 } 00:23:47.771 }, 00:23:47.771 "base_bdevs_list": [ 00:23:47.771 { 00:23:47.771 "name": "spare", 00:23:47.771 "uuid": "5ca5ef0b-4d47-5aa5-a0c8-9bd8de3300a0", 00:23:47.771 "is_configured": true, 00:23:47.771 "data_offset": 0, 00:23:47.771 "data_size": 65536 00:23:47.771 }, 00:23:47.771 { 00:23:47.771 "name": "BaseBdev2", 00:23:47.771 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:47.771 "is_configured": true, 00:23:47.771 "data_offset": 0, 00:23:47.771 "data_size": 65536 00:23:47.771 } 00:23:47.771 ] 00:23:47.771 }' 00:23:47.771 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:47.771 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:47.771 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:47.771 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:47.771 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:23:47.771 [2024-07-12 18:27:31.440891] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:48.118 [2024-07-12 18:27:31.596444] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:48.118 [2024-07-12 18:27:31.753533] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:48.118 [2024-07-12 18:27:31.763532] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:48.118 [2024-07-12 18:27:31.763563] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:48.118 [2024-07-12 18:27:31.763573] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:48.118 [2024-07-12 18:27:31.786342] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe11bd0 00:23:48.403 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:48.403 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:48.403 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:48.403 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:48.403 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:48.403 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:48.404 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:48.404 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:48.404 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:48.404 18:27:31 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:48.404 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.404 18:27:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.404 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:48.404 "name": "raid_bdev1", 00:23:48.404 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:48.404 "strip_size_kb": 0, 00:23:48.404 "state": "online", 00:23:48.404 "raid_level": "raid1", 00:23:48.404 "superblock": false, 00:23:48.404 "num_base_bdevs": 2, 00:23:48.404 "num_base_bdevs_discovered": 1, 00:23:48.404 "num_base_bdevs_operational": 1, 00:23:48.404 "base_bdevs_list": [ 00:23:48.404 { 00:23:48.404 "name": null, 00:23:48.404 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:48.404 "is_configured": false, 00:23:48.404 "data_offset": 0, 00:23:48.404 "data_size": 65536 00:23:48.404 }, 00:23:48.404 { 00:23:48.404 "name": "BaseBdev2", 00:23:48.404 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:48.404 "is_configured": true, 00:23:48.404 "data_offset": 0, 00:23:48.404 "data_size": 65536 00:23:48.404 } 00:23:48.404 ] 00:23:48.404 }' 00:23:48.404 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:48.404 18:27:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:49.335 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:49.335 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:49.335 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:49.335 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:23:49.335 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:49.335 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.335 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.335 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:49.335 "name": "raid_bdev1", 00:23:49.335 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:49.335 "strip_size_kb": 0, 00:23:49.335 "state": "online", 00:23:49.335 "raid_level": "raid1", 00:23:49.335 "superblock": false, 00:23:49.335 "num_base_bdevs": 2, 00:23:49.335 "num_base_bdevs_discovered": 1, 00:23:49.335 "num_base_bdevs_operational": 1, 00:23:49.335 "base_bdevs_list": [ 00:23:49.335 { 00:23:49.335 "name": null, 00:23:49.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:49.335 "is_configured": false, 00:23:49.335 "data_offset": 0, 00:23:49.335 "data_size": 65536 00:23:49.335 }, 00:23:49.335 { 00:23:49.335 "name": "BaseBdev2", 00:23:49.335 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:49.335 "is_configured": true, 00:23:49.335 "data_offset": 0, 00:23:49.335 "data_size": 65536 00:23:49.335 } 00:23:49.335 ] 00:23:49.335 }' 00:23:49.335 18:27:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:49.335 18:27:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:49.335 18:27:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:49.591 18:27:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:49.591 18:27:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:49.591 [2024-07-12 18:27:33.294899] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:49.848 18:27:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:49.848 [2024-07-12 18:27:33.355326] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd99870 00:23:49.848 [2024-07-12 18:27:33.356831] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:49.848 [2024-07-12 18:27:33.476007] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:49.848 [2024-07-12 18:27:33.476457] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:50.105 [2024-07-12 18:27:33.722030] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:50.105 [2024-07-12 18:27:33.722255] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:50.362 [2024-07-12 18:27:34.062494] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.926 [2024-07-12 18:27:34.411245] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:50.926 [2024-07-12 18:27:34.411720] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.926 "name": "raid_bdev1", 00:23:50.926 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:50.926 "strip_size_kb": 0, 00:23:50.926 "state": "online", 00:23:50.926 "raid_level": "raid1", 00:23:50.926 "superblock": false, 00:23:50.926 "num_base_bdevs": 2, 00:23:50.926 "num_base_bdevs_discovered": 2, 00:23:50.926 "num_base_bdevs_operational": 2, 00:23:50.926 "process": { 00:23:50.926 "type": "rebuild", 00:23:50.926 "target": "spare", 00:23:50.926 "progress": { 00:23:50.926 "blocks": 14336, 00:23:50.926 "percent": 21 00:23:50.926 } 00:23:50.926 }, 00:23:50.926 "base_bdevs_list": [ 00:23:50.926 { 00:23:50.926 "name": "spare", 00:23:50.926 "uuid": "5ca5ef0b-4d47-5aa5-a0c8-9bd8de3300a0", 00:23:50.926 "is_configured": true, 00:23:50.926 "data_offset": 0, 00:23:50.926 "data_size": 65536 00:23:50.926 }, 00:23:50.926 { 00:23:50.926 "name": "BaseBdev2", 00:23:50.926 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:50.926 "is_configured": true, 00:23:50.926 "data_offset": 0, 00:23:50.926 "data_size": 65536 00:23:50.926 } 00:23:50.926 ] 00:23:50.926 }' 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.926 [2024-07-12 18:27:34.632343] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 
12288 offset_end: 18432 00:23:50.926 [2024-07-12 18:27:34.632590] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:50.926 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=827 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.184 18:27:34 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.442 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:51.442 "name": "raid_bdev1", 00:23:51.442 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:51.442 "strip_size_kb": 0, 00:23:51.442 "state": "online", 00:23:51.442 "raid_level": "raid1", 00:23:51.442 "superblock": false, 00:23:51.442 "num_base_bdevs": 2, 00:23:51.442 "num_base_bdevs_discovered": 2, 00:23:51.442 "num_base_bdevs_operational": 2, 00:23:51.442 "process": { 00:23:51.442 "type": "rebuild", 00:23:51.442 "target": "spare", 00:23:51.442 "progress": { 00:23:51.442 "blocks": 18432, 00:23:51.442 "percent": 28 00:23:51.442 } 00:23:51.442 }, 00:23:51.442 "base_bdevs_list": [ 00:23:51.442 { 00:23:51.442 "name": "spare", 00:23:51.442 "uuid": "5ca5ef0b-4d47-5aa5-a0c8-9bd8de3300a0", 00:23:51.442 "is_configured": true, 00:23:51.442 "data_offset": 0, 00:23:51.442 "data_size": 65536 00:23:51.442 }, 00:23:51.442 { 00:23:51.442 "name": "BaseBdev2", 00:23:51.442 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:51.442 "is_configured": true, 00:23:51.442 "data_offset": 0, 00:23:51.442 "data_size": 65536 00:23:51.442 } 00:23:51.442 ] 00:23:51.442 }' 00:23:51.442 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:51.442 [2024-07-12 18:27:34.981417] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:51.442 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:51.442 18:27:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:51.442 18:27:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:51.442 18:27:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:51.442 [2024-07-12 18:27:35.108818] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:51.442 [2024-07-12 18:27:35.109002] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:52.374 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:52.374 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:52.374 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.374 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:52.374 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:52.374 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:52.374 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.374 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.631 [2024-07-12 18:27:36.115905] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:52.631 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.631 "name": "raid_bdev1", 00:23:52.631 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:52.631 "strip_size_kb": 0, 00:23:52.631 "state": "online", 00:23:52.631 "raid_level": "raid1", 00:23:52.631 "superblock": false, 00:23:52.631 "num_base_bdevs": 2, 00:23:52.631 "num_base_bdevs_discovered": 2, 00:23:52.631 "num_base_bdevs_operational": 2, 00:23:52.631 "process": { 00:23:52.631 "type": "rebuild", 00:23:52.631 "target": "spare", 
00:23:52.631 "progress": { 00:23:52.631 "blocks": 40960, 00:23:52.631 "percent": 62 00:23:52.631 } 00:23:52.631 }, 00:23:52.631 "base_bdevs_list": [ 00:23:52.631 { 00:23:52.631 "name": "spare", 00:23:52.631 "uuid": "5ca5ef0b-4d47-5aa5-a0c8-9bd8de3300a0", 00:23:52.631 "is_configured": true, 00:23:52.631 "data_offset": 0, 00:23:52.631 "data_size": 65536 00:23:52.631 }, 00:23:52.631 { 00:23:52.631 "name": "BaseBdev2", 00:23:52.631 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:52.631 "is_configured": true, 00:23:52.631 "data_offset": 0, 00:23:52.631 "data_size": 65536 00:23:52.631 } 00:23:52.631 ] 00:23:52.631 }' 00:23:52.631 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.631 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:52.631 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.888 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:52.888 18:27:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:52.888 [2024-07-12 18:27:36.461735] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:23:53.820 [2024-07-12 18:27:37.248164] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:23:53.820 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:53.820 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:53.820 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:53.820 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:53.820 18:27:37 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:53.820 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:53.820 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.820 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.103 [2024-07-12 18:27:37.589961] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:54.103 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:54.103 "name": "raid_bdev1", 00:23:54.103 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:54.103 "strip_size_kb": 0, 00:23:54.103 "state": "online", 00:23:54.103 "raid_level": "raid1", 00:23:54.103 "superblock": false, 00:23:54.103 "num_base_bdevs": 2, 00:23:54.103 "num_base_bdevs_discovered": 2, 00:23:54.103 "num_base_bdevs_operational": 2, 00:23:54.103 "process": { 00:23:54.103 "type": "rebuild", 00:23:54.103 "target": "spare", 00:23:54.103 "progress": { 00:23:54.103 "blocks": 65536, 00:23:54.103 "percent": 100 00:23:54.103 } 00:23:54.103 }, 00:23:54.103 "base_bdevs_list": [ 00:23:54.103 { 00:23:54.103 "name": "spare", 00:23:54.103 "uuid": "5ca5ef0b-4d47-5aa5-a0c8-9bd8de3300a0", 00:23:54.103 "is_configured": true, 00:23:54.103 "data_offset": 0, 00:23:54.103 "data_size": 65536 00:23:54.103 }, 00:23:54.103 { 00:23:54.103 "name": "BaseBdev2", 00:23:54.103 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:54.103 "is_configured": true, 00:23:54.103 "data_offset": 0, 00:23:54.103 "data_size": 65536 00:23:54.103 } 00:23:54.103 ] 00:23:54.103 }' 00:23:54.103 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:54.103 18:27:37 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:54.104 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:54.104 [2024-07-12 18:27:37.698240] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:54.104 [2024-07-12 18:27:37.700314] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:54.104 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:54.104 18:27:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:55.034 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:55.034 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:55.034 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:55.034 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:55.034 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:55.034 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:55.034 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.034 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.291 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:55.291 "name": "raid_bdev1", 00:23:55.291 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:55.291 "strip_size_kb": 0, 00:23:55.291 "state": "online", 00:23:55.291 "raid_level": "raid1", 00:23:55.291 "superblock": false, 00:23:55.291 "num_base_bdevs": 2, 
00:23:55.291 "num_base_bdevs_discovered": 2, 00:23:55.291 "num_base_bdevs_operational": 2, 00:23:55.291 "base_bdevs_list": [ 00:23:55.291 { 00:23:55.291 "name": "spare", 00:23:55.291 "uuid": "5ca5ef0b-4d47-5aa5-a0c8-9bd8de3300a0", 00:23:55.291 "is_configured": true, 00:23:55.291 "data_offset": 0, 00:23:55.291 "data_size": 65536 00:23:55.291 }, 00:23:55.291 { 00:23:55.291 "name": "BaseBdev2", 00:23:55.291 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:55.291 "is_configured": true, 00:23:55.291 "data_offset": 0, 00:23:55.291 "data_size": 65536 00:23:55.291 } 00:23:55.291 ] 00:23:55.291 }' 00:23:55.291 18:27:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.547 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:55.808 "name": "raid_bdev1", 00:23:55.808 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:55.808 "strip_size_kb": 0, 00:23:55.808 "state": "online", 00:23:55.808 "raid_level": "raid1", 00:23:55.808 "superblock": false, 00:23:55.808 "num_base_bdevs": 2, 00:23:55.808 "num_base_bdevs_discovered": 2, 00:23:55.808 "num_base_bdevs_operational": 2, 00:23:55.808 "base_bdevs_list": [ 00:23:55.808 { 00:23:55.808 "name": "spare", 00:23:55.808 "uuid": "5ca5ef0b-4d47-5aa5-a0c8-9bd8de3300a0", 00:23:55.808 "is_configured": true, 00:23:55.808 "data_offset": 0, 00:23:55.808 "data_size": 65536 00:23:55.808 }, 00:23:55.808 { 00:23:55.808 "name": "BaseBdev2", 00:23:55.808 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:55.808 "is_configured": true, 00:23:55.808 "data_offset": 0, 00:23:55.808 "data_size": 65536 00:23:55.808 } 00:23:55.808 ] 00:23:55.808 }' 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.808 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.065 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:56.065 "name": "raid_bdev1", 00:23:56.065 "uuid": "ea234223-afcf-4f65-9476-04d040e16fb4", 00:23:56.065 "strip_size_kb": 0, 00:23:56.065 "state": "online", 00:23:56.065 "raid_level": "raid1", 00:23:56.065 "superblock": false, 00:23:56.065 "num_base_bdevs": 2, 00:23:56.065 "num_base_bdevs_discovered": 2, 00:23:56.065 "num_base_bdevs_operational": 2, 00:23:56.065 "base_bdevs_list": [ 00:23:56.065 { 00:23:56.065 "name": "spare", 00:23:56.065 "uuid": "5ca5ef0b-4d47-5aa5-a0c8-9bd8de3300a0", 00:23:56.065 "is_configured": true, 00:23:56.065 "data_offset": 0, 00:23:56.065 "data_size": 65536 00:23:56.065 }, 00:23:56.065 { 00:23:56.065 "name": "BaseBdev2", 00:23:56.065 "uuid": "ce3028aa-4605-5130-a00a-c8f24079b799", 00:23:56.065 "is_configured": true, 00:23:56.065 "data_offset": 0, 00:23:56.065 "data_size": 65536 00:23:56.065 } 00:23:56.065 ] 00:23:56.065 }' 00:23:56.065 18:27:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:56.065 18:27:39 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@10 -- # set +x 00:23:56.629 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:56.886 [2024-07-12 18:27:40.494018] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:56.886 [2024-07-12 18:27:40.494050] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:56.886 00:23:56.886 Latency(us) 00:23:56.886 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:56.886 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:56.886 raid_bdev1 : 11.81 93.72 281.17 0.00 0.00 14448.77 293.84 118534.68 00:23:56.886 =================================================================================================================== 00:23:56.886 Total : 93.72 281.17 0.00 0.00 14448.77 293.84 118534.68 00:23:56.886 [2024-07-12 18:27:40.566140] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:56.886 [2024-07-12 18:27:40.566166] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:56.886 [2024-07-12 18:27:40.566239] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:56.886 [2024-07-12 18:27:40.566251] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe17070 name raid_bdev1, state offline 00:23:56.886 0 00:23:56.886 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.886 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:57.142 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:57.143 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:57.143 18:27:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:57.399 /dev/nbd0 00:23:57.399 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:57.399 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:57.399 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:57.399 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:57.399 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:57.400 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:57.400 18:27:41 bdev_raid.raid_rebuild_test_io 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:57.400 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:57.400 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:57.400 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:57.400 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:57.400 1+0 records in 00:23:57.400 1+0 records out 00:23:57.400 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027065 s, 15.1 MB/s 00:23:57.400 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:57.400 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:57.400 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:57.657 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:57.658 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:23:57.658 /dev/nbd1 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 
20 )) 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:57.915 1+0 records in 00:23:57.915 1+0 records out 00:23:57.915 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022197 s, 18.5 MB/s 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:57.915 18:27:41 bdev_raid.raid_rebuild_test_io 
-- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:58.172 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:58.173 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:58.173 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:58.173 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit 
nbd0 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2573969 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2573969 ']' 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2573969 00:23:58.430 18:27:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:23:58.430 18:27:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:58.430 18:27:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2573969 00:23:58.430 18:27:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:58.430 18:27:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:58.430 18:27:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2573969' 00:23:58.430 killing process with pid 2573969 00:23:58.430 18:27:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2573969 00:23:58.430 Received shutdown signal, test time was about 13.293786 seconds 00:23:58.430 00:23:58.430 Latency(us) 00:23:58.430 
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:58.430 =================================================================================================================== 00:23:58.430 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:58.430 [2024-07-12 18:27:42.047979] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:58.430 18:27:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2573969 00:23:58.430 [2024-07-12 18:27:42.068763] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:58.688 00:23:58.688 real 0m17.925s 00:23:58.688 user 0m27.283s 00:23:58.688 sys 0m2.850s 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:58.688 ************************************ 00:23:58.688 END TEST raid_rebuild_test_io 00:23:58.688 ************************************ 00:23:58.688 18:27:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:58.688 18:27:42 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:23:58.688 18:27:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:58.688 18:27:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:58.688 18:27:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:58.688 ************************************ 00:23:58.688 START TEST raid_rebuild_test_sb_io 00:23:58.688 ************************************ 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:58.688 18:27:42 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 
00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2576499 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2576499 /var/tmp/spdk-raid.sock 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2576499 ']' 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:58.688 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:58.688 18:27:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:58.946 [2024-07-12 18:27:42.432318] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:23:58.946 [2024-07-12 18:27:42.432386] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2576499 ] 00:23:58.946 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:58.946 Zero copy mechanism will not be used. 00:23:58.946 [2024-07-12 18:27:42.559881] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:58.946 [2024-07-12 18:27:42.657746] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:59.203 [2024-07-12 18:27:42.715716] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:59.203 [2024-07-12 18:27:42.715772] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:59.766 18:27:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:59.766 18:27:43 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:23:59.766 18:27:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:59.766 18:27:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:00.022 BaseBdev1_malloc 00:24:00.022 18:27:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:00.279 [2024-07-12 18:27:43.756023] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:00.279 [2024-07-12 18:27:43.756073] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:00.279 [2024-07-12 18:27:43.756099] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265dd40 00:24:00.279 [2024-07-12 18:27:43.756112] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:00.279 [2024-07-12 18:27:43.757896] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:00.279 [2024-07-12 18:27:43.757934] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:00.279 BaseBdev1 00:24:00.279 18:27:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:00.279 18:27:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:00.279 BaseBdev2_malloc 00:24:00.535 18:27:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:00.535 [2024-07-12 18:27:44.242292] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:00.535 [2024-07-12 18:27:44.242344] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:00.535 [2024-07-12 18:27:44.242371] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265e860 00:24:00.535 [2024-07-12 18:27:44.242384] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:00.535 [2024-07-12 18:27:44.243902] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:00.535 [2024-07-12 18:27:44.243940] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:00.535 BaseBdev2 00:24:00.792 18:27:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b spare_malloc 00:24:00.792 spare_malloc 00:24:01.049 18:27:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:01.049 spare_delay 00:24:01.049 18:27:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:01.305 [2024-07-12 18:27:45.002172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:01.305 [2024-07-12 18:27:45.002226] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:01.305 [2024-07-12 18:27:45.002251] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x280cec0 00:24:01.305 [2024-07-12 18:27:45.002264] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:01.305 [2024-07-12 18:27:45.003826] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:01.305 [2024-07-12 18:27:45.003856] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:01.305 spare 00:24:01.305 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:01.562 [2024-07-12 18:27:45.254869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:01.562 [2024-07-12 18:27:45.256243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:01.562 [2024-07-12 18:27:45.256418] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x280e070 00:24:01.562 [2024-07-12 18:27:45.256431] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:01.562 [2024-07-12 18:27:45.256633] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2807490 00:24:01.562 [2024-07-12 18:27:45.256776] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x280e070 00:24:01.562 [2024-07-12 18:27:45.256786] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x280e070 00:24:01.562 [2024-07-12 18:27:45.256893] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.562 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.819 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:01.819 "name": "raid_bdev1", 00:24:01.819 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:01.819 "strip_size_kb": 0, 00:24:01.819 "state": "online", 00:24:01.819 "raid_level": "raid1", 00:24:01.819 "superblock": true, 00:24:01.819 "num_base_bdevs": 2, 00:24:01.819 "num_base_bdevs_discovered": 2, 00:24:01.819 "num_base_bdevs_operational": 2, 00:24:01.819 "base_bdevs_list": [ 00:24:01.819 { 00:24:01.819 "name": "BaseBdev1", 00:24:01.819 "uuid": "4070e1ae-de08-5f3a-a661-48ee9f7a3587", 00:24:01.819 "is_configured": true, 00:24:01.819 "data_offset": 2048, 00:24:01.819 "data_size": 63488 00:24:01.819 }, 00:24:01.819 { 00:24:01.819 "name": "BaseBdev2", 00:24:01.819 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:01.819 "is_configured": true, 00:24:01.819 "data_offset": 2048, 00:24:01.819 "data_size": 63488 00:24:01.819 } 00:24:01.819 ] 00:24:01.819 }' 00:24:01.819 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:01.819 18:27:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:02.749 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:02.749 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:02.749 [2024-07-12 18:27:46.333977] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:02.749 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:02.749 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:24:02.749 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:03.005 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:03.005 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:03.005 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:03.005 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:03.005 [2024-07-12 18:27:46.708813] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x280ec50 00:24:03.005 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:03.005 Zero copy mechanism will not be used. 00:24:03.005 Running I/O for 60 seconds... 
00:24:03.263 [2024-07-12 18:27:46.829314] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:03.263 [2024-07-12 18:27:46.829509] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x280ec50 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.263 18:27:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.520 18:27:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:03.520 "name": "raid_bdev1", 00:24:03.520 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:03.520 "strip_size_kb": 0, 00:24:03.520 "state": "online", 00:24:03.520 "raid_level": 
"raid1", 00:24:03.520 "superblock": true, 00:24:03.520 "num_base_bdevs": 2, 00:24:03.520 "num_base_bdevs_discovered": 1, 00:24:03.520 "num_base_bdevs_operational": 1, 00:24:03.520 "base_bdevs_list": [ 00:24:03.520 { 00:24:03.520 "name": null, 00:24:03.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.520 "is_configured": false, 00:24:03.520 "data_offset": 2048, 00:24:03.520 "data_size": 63488 00:24:03.520 }, 00:24:03.520 { 00:24:03.520 "name": "BaseBdev2", 00:24:03.520 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:03.520 "is_configured": true, 00:24:03.520 "data_offset": 2048, 00:24:03.520 "data_size": 63488 00:24:03.520 } 00:24:03.520 ] 00:24:03.520 }' 00:24:03.520 18:27:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:03.520 18:27:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:04.084 18:27:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:04.341 [2024-07-12 18:27:47.990902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:04.341 18:27:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:04.341 [2024-07-12 18:27:48.058239] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x277a230 00:24:04.341 [2024-07-12 18:27:48.060551] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:04.597 [2024-07-12 18:27:48.195455] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:04.597 [2024-07-12 18:27:48.195858] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:04.854 [2024-07-12 18:27:48.415322] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:04.854 [2024-07-12 18:27:48.415580] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:05.418 [2024-07-12 18:27:48.887785] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:05.418 [2024-07-12 18:27:48.887988] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:05.418 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:05.418 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:05.418 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:05.418 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:05.418 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:05.418 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.418 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.675 [2024-07-12 18:27:49.262217] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:05.675 [2024-07-12 18:27:49.262547] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:05.675 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:05.675 "name": "raid_bdev1", 00:24:05.675 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:05.675 
"strip_size_kb": 0, 00:24:05.675 "state": "online", 00:24:05.675 "raid_level": "raid1", 00:24:05.675 "superblock": true, 00:24:05.676 "num_base_bdevs": 2, 00:24:05.676 "num_base_bdevs_discovered": 2, 00:24:05.676 "num_base_bdevs_operational": 2, 00:24:05.676 "process": { 00:24:05.676 "type": "rebuild", 00:24:05.676 "target": "spare", 00:24:05.676 "progress": { 00:24:05.676 "blocks": 14336, 00:24:05.676 "percent": 22 00:24:05.676 } 00:24:05.676 }, 00:24:05.676 "base_bdevs_list": [ 00:24:05.676 { 00:24:05.676 "name": "spare", 00:24:05.676 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:05.676 "is_configured": true, 00:24:05.676 "data_offset": 2048, 00:24:05.676 "data_size": 63488 00:24:05.676 }, 00:24:05.676 { 00:24:05.676 "name": "BaseBdev2", 00:24:05.676 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:05.676 "is_configured": true, 00:24:05.676 "data_offset": 2048, 00:24:05.676 "data_size": 63488 00:24:05.676 } 00:24:05.676 ] 00:24:05.676 }' 00:24:05.676 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:05.676 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:05.676 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:05.676 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:05.676 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:05.933 [2024-07-12 18:27:49.473114] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:05.933 [2024-07-12 18:27:49.473425] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:05.933 [2024-07-12 
18:27:49.623219] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.190 [2024-07-12 18:27:49.796188] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:06.190 [2024-07-12 18:27:49.814047] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:06.190 [2024-07-12 18:27:49.814077] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.190 [2024-07-12 18:27:49.814087] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:06.190 [2024-07-12 18:27:49.828443] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x280ec50 00:24:06.190 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:06.190 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:06.190 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:06.190 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.190 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:06.190 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:06.190 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.190 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.191 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.191 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.191 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.191 18:27:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.447 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:06.447 "name": "raid_bdev1", 00:24:06.447 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:06.447 "strip_size_kb": 0, 00:24:06.447 "state": "online", 00:24:06.447 "raid_level": "raid1", 00:24:06.447 "superblock": true, 00:24:06.447 "num_base_bdevs": 2, 00:24:06.447 "num_base_bdevs_discovered": 1, 00:24:06.447 "num_base_bdevs_operational": 1, 00:24:06.447 "base_bdevs_list": [ 00:24:06.447 { 00:24:06.447 "name": null, 00:24:06.447 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.447 "is_configured": false, 00:24:06.448 "data_offset": 2048, 00:24:06.448 "data_size": 63488 00:24:06.448 }, 00:24:06.448 { 00:24:06.448 "name": "BaseBdev2", 00:24:06.448 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:06.448 "is_configured": true, 00:24:06.448 "data_offset": 2048, 00:24:06.448 "data_size": 63488 00:24:06.448 } 00:24:06.448 ] 00:24:06.448 }' 00:24:06.448 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:06.448 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:07.379 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:07.379 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.379 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:07.379 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:07.379 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:24:07.379 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.379 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.379 18:27:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.379 "name": "raid_bdev1", 00:24:07.379 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:07.379 "strip_size_kb": 0, 00:24:07.379 "state": "online", 00:24:07.379 "raid_level": "raid1", 00:24:07.379 "superblock": true, 00:24:07.379 "num_base_bdevs": 2, 00:24:07.379 "num_base_bdevs_discovered": 1, 00:24:07.379 "num_base_bdevs_operational": 1, 00:24:07.379 "base_bdevs_list": [ 00:24:07.379 { 00:24:07.379 "name": null, 00:24:07.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.379 "is_configured": false, 00:24:07.379 "data_offset": 2048, 00:24:07.379 "data_size": 63488 00:24:07.379 }, 00:24:07.379 { 00:24:07.379 "name": "BaseBdev2", 00:24:07.379 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:07.379 "is_configured": true, 00:24:07.379 "data_offset": 2048, 00:24:07.379 "data_size": 63488 00:24:07.379 } 00:24:07.379 ] 00:24:07.379 }' 00:24:07.379 18:27:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:07.379 18:27:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:07.379 18:27:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:07.379 18:27:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:07.379 18:27:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:07.692 
[2024-07-12 18:27:51.315252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:07.692 18:27:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:07.692 [2024-07-12 18:27:51.399408] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x280ee60 00:24:07.692 [2024-07-12 18:27:51.400907] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:07.979 [2024-07-12 18:27:51.533336] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:07.979 [2024-07-12 18:27:51.660509] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:07.979 [2024-07-12 18:27:51.660748] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:08.544 [2024-07-12 18:27:52.000850] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:08.544 [2024-07-12 18:27:52.001301] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:08.544 [2024-07-12 18:27:52.220883] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:08.544 [2024-07-12 18:27:52.221144] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:08.801 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:08.801 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.801 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:08.801 18:27:52 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:08.801 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.801 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.801 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.059 "name": "raid_bdev1", 00:24:09.059 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:09.059 "strip_size_kb": 0, 00:24:09.059 "state": "online", 00:24:09.059 "raid_level": "raid1", 00:24:09.059 "superblock": true, 00:24:09.059 "num_base_bdevs": 2, 00:24:09.059 "num_base_bdevs_discovered": 2, 00:24:09.059 "num_base_bdevs_operational": 2, 00:24:09.059 "process": { 00:24:09.059 "type": "rebuild", 00:24:09.059 "target": "spare", 00:24:09.059 "progress": { 00:24:09.059 "blocks": 14336, 00:24:09.059 "percent": 22 00:24:09.059 } 00:24:09.059 }, 00:24:09.059 "base_bdevs_list": [ 00:24:09.059 { 00:24:09.059 "name": "spare", 00:24:09.059 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:09.059 "is_configured": true, 00:24:09.059 "data_offset": 2048, 00:24:09.059 "data_size": 63488 00:24:09.059 }, 00:24:09.059 { 00:24:09.059 "name": "BaseBdev2", 00:24:09.059 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:09.059 "is_configured": true, 00:24:09.059 "data_offset": 2048, 00:24:09.059 "data_size": 63488 00:24:09.059 } 00:24:09.059 ] 00:24:09.059 }' 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.059 [2024-07-12 18:27:52.686634] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:09.059 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=845 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:24:09.059 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.316 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.316 "name": "raid_bdev1", 00:24:09.316 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:09.316 "strip_size_kb": 0, 00:24:09.316 "state": "online", 00:24:09.316 "raid_level": "raid1", 00:24:09.316 "superblock": true, 00:24:09.316 "num_base_bdevs": 2, 00:24:09.316 "num_base_bdevs_discovered": 2, 00:24:09.316 "num_base_bdevs_operational": 2, 00:24:09.316 "process": { 00:24:09.316 "type": "rebuild", 00:24:09.316 "target": "spare", 00:24:09.316 "progress": { 00:24:09.316 "blocks": 18432, 00:24:09.316 "percent": 29 00:24:09.316 } 00:24:09.316 }, 00:24:09.316 "base_bdevs_list": [ 00:24:09.316 { 00:24:09.316 "name": "spare", 00:24:09.316 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:09.316 "is_configured": true, 00:24:09.316 "data_offset": 2048, 00:24:09.316 "data_size": 63488 00:24:09.316 }, 00:24:09.316 { 00:24:09.316 "name": "BaseBdev2", 00:24:09.316 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:09.316 "is_configured": true, 00:24:09.316 "data_offset": 2048, 00:24:09.316 "data_size": 63488 00:24:09.316 } 00:24:09.316 ] 00:24:09.316 }' 00:24:09.316 18:27:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.316 18:27:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:09.316 18:27:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.572 18:27:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.572 18:27:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:09.830 [2024-07-12 18:27:53.473140] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 
offset_begin: 24576 offset_end: 30720 00:24:10.088 [2024-07-12 18:27:53.810919] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:24:10.345 [2024-07-12 18:27:53.920410] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:10.345 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:10.345 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.345 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.345 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:10.345 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:10.345 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.345 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.345 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.603 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.603 "name": "raid_bdev1", 00:24:10.603 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:10.603 "strip_size_kb": 0, 00:24:10.603 "state": "online", 00:24:10.603 "raid_level": "raid1", 00:24:10.603 "superblock": true, 00:24:10.603 "num_base_bdevs": 2, 00:24:10.603 "num_base_bdevs_discovered": 2, 00:24:10.603 "num_base_bdevs_operational": 2, 00:24:10.603 "process": { 00:24:10.603 "type": "rebuild", 00:24:10.603 "target": "spare", 00:24:10.603 "progress": { 00:24:10.603 "blocks": 40960, 
00:24:10.603 "percent": 64 00:24:10.603 } 00:24:10.603 }, 00:24:10.603 "base_bdevs_list": [ 00:24:10.603 { 00:24:10.603 "name": "spare", 00:24:10.603 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:10.603 "is_configured": true, 00:24:10.603 "data_offset": 2048, 00:24:10.603 "data_size": 63488 00:24:10.603 }, 00:24:10.603 { 00:24:10.603 "name": "BaseBdev2", 00:24:10.603 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:10.603 "is_configured": true, 00:24:10.603 "data_offset": 2048, 00:24:10.603 "data_size": 63488 00:24:10.603 } 00:24:10.603 ] 00:24:10.603 }' 00:24:10.603 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.860 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:10.860 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.860 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:10.860 18:27:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:11.425 [2024-07-12 18:27:54.876998] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:24:11.683 [2024-07-12 18:27:55.314601] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:24:11.683 [2024-07-12 18:27:55.315007] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:24:11.683 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:11.683 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:11.683 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:24:11.683 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:11.683 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:11.683 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.941 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.941 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.941 [2024-07-12 18:27:55.524740] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:24:11.941 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.941 "name": "raid_bdev1", 00:24:11.941 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:11.941 "strip_size_kb": 0, 00:24:11.941 "state": "online", 00:24:11.941 "raid_level": "raid1", 00:24:11.941 "superblock": true, 00:24:11.941 "num_base_bdevs": 2, 00:24:11.941 "num_base_bdevs_discovered": 2, 00:24:11.941 "num_base_bdevs_operational": 2, 00:24:11.941 "process": { 00:24:11.941 "type": "rebuild", 00:24:11.941 "target": "spare", 00:24:11.941 "progress": { 00:24:11.941 "blocks": 59392, 00:24:11.941 "percent": 93 00:24:11.941 } 00:24:11.941 }, 00:24:11.941 "base_bdevs_list": [ 00:24:11.941 { 00:24:11.941 "name": "spare", 00:24:11.941 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:11.941 "is_configured": true, 00:24:11.941 "data_offset": 2048, 00:24:11.941 "data_size": 63488 00:24:11.941 }, 00:24:11.941 { 00:24:11.941 "name": "BaseBdev2", 00:24:11.941 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:11.941 "is_configured": true, 00:24:11.941 "data_offset": 2048, 00:24:11.941 "data_size": 63488 00:24:11.941 } 00:24:11.941 ] 00:24:11.941 }' 
00:24:11.941 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:12.199 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:12.199 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:12.199 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:12.199 18:27:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:12.199 [2024-07-12 18:27:55.830259] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:12.199 [2024-07-12 18:27:55.872087] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:12.199 [2024-07-12 18:27:55.873845] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:13.133 18:27:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:13.133 18:27:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:13.133 18:27:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:13.133 18:27:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:13.133 18:27:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:13.133 18:27:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:13.133 18:27:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.133 18:27:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.391 18:27:57 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:13.391 "name": "raid_bdev1", 00:24:13.391 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:13.391 "strip_size_kb": 0, 00:24:13.391 "state": "online", 00:24:13.391 "raid_level": "raid1", 00:24:13.391 "superblock": true, 00:24:13.391 "num_base_bdevs": 2, 00:24:13.391 "num_base_bdevs_discovered": 2, 00:24:13.391 "num_base_bdevs_operational": 2, 00:24:13.391 "base_bdevs_list": [ 00:24:13.391 { 00:24:13.391 "name": "spare", 00:24:13.391 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:13.391 "is_configured": true, 00:24:13.391 "data_offset": 2048, 00:24:13.391 "data_size": 63488 00:24:13.391 }, 00:24:13.391 { 00:24:13.391 "name": "BaseBdev2", 00:24:13.391 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:13.391 "is_configured": true, 00:24:13.391 "data_offset": 2048, 00:24:13.391 "data_size": 63488 00:24:13.391 } 00:24:13.391 ] 00:24:13.391 }' 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:13.391 18:27:57 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.391 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.649 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:13.649 "name": "raid_bdev1", 00:24:13.649 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:13.649 "strip_size_kb": 0, 00:24:13.649 "state": "online", 00:24:13.649 "raid_level": "raid1", 00:24:13.649 "superblock": true, 00:24:13.649 "num_base_bdevs": 2, 00:24:13.649 "num_base_bdevs_discovered": 2, 00:24:13.649 "num_base_bdevs_operational": 2, 00:24:13.649 "base_bdevs_list": [ 00:24:13.649 { 00:24:13.649 "name": "spare", 00:24:13.649 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:13.649 "is_configured": true, 00:24:13.649 "data_offset": 2048, 00:24:13.649 "data_size": 63488 00:24:13.649 }, 00:24:13.649 { 00:24:13.649 "name": "BaseBdev2", 00:24:13.649 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:13.649 "is_configured": true, 00:24:13.649 "data_offset": 2048, 00:24:13.649 "data_size": 63488 00:24:13.649 } 00:24:13.649 ] 00:24:13.649 }' 00:24:13.649 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:13.907 
18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:13.907 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:13.908 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.908 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.165 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:14.165 "name": "raid_bdev1", 00:24:14.165 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:14.165 "strip_size_kb": 0, 00:24:14.165 "state": "online", 00:24:14.165 "raid_level": "raid1", 00:24:14.165 "superblock": true, 00:24:14.165 "num_base_bdevs": 2, 00:24:14.165 "num_base_bdevs_discovered": 2, 00:24:14.165 "num_base_bdevs_operational": 2, 00:24:14.165 "base_bdevs_list": [ 00:24:14.165 { 00:24:14.165 "name": "spare", 00:24:14.165 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:14.165 "is_configured": true, 00:24:14.165 "data_offset": 2048, 00:24:14.165 
"data_size": 63488 00:24:14.165 }, 00:24:14.165 { 00:24:14.165 "name": "BaseBdev2", 00:24:14.165 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:14.165 "is_configured": true, 00:24:14.165 "data_offset": 2048, 00:24:14.165 "data_size": 63488 00:24:14.165 } 00:24:14.165 ] 00:24:14.165 }' 00:24:14.165 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:14.165 18:27:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:14.731 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:14.989 [2024-07-12 18:27:58.514315] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:14.989 [2024-07-12 18:27:58.514347] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:14.989 00:24:14.989 Latency(us) 00:24:14.989 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:14.989 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:14.989 raid_bdev1 : 11.87 92.80 278.41 0.00 0.00 14534.85 295.62 119446.48 00:24:14.989 =================================================================================================================== 00:24:14.989 Total : 92.80 278.41 0.00 0.00 14534.85 295.62 119446.48 00:24:14.989 [2024-07-12 18:27:58.618565] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:14.989 [2024-07-12 18:27:58.618593] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:14.989 [2024-07-12 18:27:58.618665] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:14.989 [2024-07-12 18:27:58.618677] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x280e070 name raid_bdev1, state offline 00:24:14.989 
0 00:24:14.989 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:14.989 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:15.247 18:27:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:15.505 /dev/nbd0 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:15.505 1+0 records in 00:24:15.505 1+0 records out 00:24:15.505 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241954 s, 16.9 MB/s 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # 
(( i++ )) 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:15.505 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:15.763 /dev/nbd1 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:15.763 18:27:59 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:15.763 1+0 records in 00:24:15.763 1+0 records out 00:24:15.763 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275963 s, 14.8 MB/s 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:15.763 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:16.020 18:27:59 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:16.020 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:16.020 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:16.020 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:16.020 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:16.020 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:16.020 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:16.278 18:27:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:16.535 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:16.535 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:16.536 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:16.536 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:16.536 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:16.536 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:16.536 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:16.536 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:16.536 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:16.536 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:16.794 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:17.052 [2024-07-12 18:28:00.559960] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on spare_delay 00:24:17.052 [2024-07-12 18:28:00.560009] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:17.052 [2024-07-12 18:28:00.560034] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265d490 00:24:17.052 [2024-07-12 18:28:00.560046] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:17.052 [2024-07-12 18:28:00.561661] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:17.052 [2024-07-12 18:28:00.561692] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:17.052 [2024-07-12 18:28:00.561772] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:17.052 [2024-07-12 18:28:00.561797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:17.052 [2024-07-12 18:28:00.561895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:17.052 spare 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:17.053 18:28:00 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.053 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.053 [2024-07-12 18:28:00.662222] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x265cf70 00:24:17.053 [2024-07-12 18:28:00.662241] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:17.053 [2024-07-12 18:28:00.662442] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2807490 00:24:17.053 [2024-07-12 18:28:00.662595] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x265cf70 00:24:17.053 [2024-07-12 18:28:00.662606] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x265cf70 00:24:17.053 [2024-07-12 18:28:00.662718] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:17.311 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.311 "name": "raid_bdev1", 00:24:17.311 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:17.311 "strip_size_kb": 0, 00:24:17.311 "state": "online", 00:24:17.311 "raid_level": "raid1", 00:24:17.311 "superblock": true, 00:24:17.311 "num_base_bdevs": 2, 00:24:17.311 "num_base_bdevs_discovered": 2, 00:24:17.311 "num_base_bdevs_operational": 2, 00:24:17.311 "base_bdevs_list": [ 00:24:17.311 { 00:24:17.311 "name": "spare", 00:24:17.311 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:17.311 "is_configured": true, 00:24:17.311 "data_offset": 2048, 00:24:17.311 "data_size": 63488 00:24:17.311 }, 
00:24:17.311 { 00:24:17.311 "name": "BaseBdev2", 00:24:17.311 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:17.311 "is_configured": true, 00:24:17.311 "data_offset": 2048, 00:24:17.311 "data_size": 63488 00:24:17.311 } 00:24:17.311 ] 00:24:17.311 }' 00:24:17.311 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.311 18:28:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:17.877 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:17.877 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:17.877 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:17.877 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:17.877 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:17.877 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.877 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.147 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:18.147 "name": "raid_bdev1", 00:24:18.147 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:18.147 "strip_size_kb": 0, 00:24:18.147 "state": "online", 00:24:18.147 "raid_level": "raid1", 00:24:18.147 "superblock": true, 00:24:18.147 "num_base_bdevs": 2, 00:24:18.147 "num_base_bdevs_discovered": 2, 00:24:18.147 "num_base_bdevs_operational": 2, 00:24:18.147 "base_bdevs_list": [ 00:24:18.147 { 00:24:18.147 "name": "spare", 00:24:18.147 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:18.147 "is_configured": true, 
00:24:18.147 "data_offset": 2048, 00:24:18.147 "data_size": 63488 00:24:18.147 }, 00:24:18.147 { 00:24:18.147 "name": "BaseBdev2", 00:24:18.147 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:18.147 "is_configured": true, 00:24:18.147 "data_offset": 2048, 00:24:18.147 "data_size": 63488 00:24:18.147 } 00:24:18.147 ] 00:24:18.147 }' 00:24:18.147 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:18.147 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:18.147 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:18.147 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:18.147 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.147 18:28:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:18.405 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:18.405 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:18.663 [2024-07-12 18:28:02.244793] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.663 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.921 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:18.921 "name": "raid_bdev1", 00:24:18.921 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:18.921 "strip_size_kb": 0, 00:24:18.921 "state": "online", 00:24:18.921 "raid_level": "raid1", 00:24:18.921 "superblock": true, 00:24:18.921 "num_base_bdevs": 2, 00:24:18.921 "num_base_bdevs_discovered": 1, 00:24:18.921 "num_base_bdevs_operational": 1, 00:24:18.921 "base_bdevs_list": [ 00:24:18.921 { 00:24:18.921 "name": null, 00:24:18.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:18.921 "is_configured": false, 00:24:18.921 "data_offset": 2048, 00:24:18.921 "data_size": 63488 00:24:18.921 }, 00:24:18.921 { 00:24:18.921 "name": "BaseBdev2", 00:24:18.921 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:18.921 "is_configured": true, 00:24:18.921 "data_offset": 2048, 00:24:18.921 "data_size": 63488 00:24:18.921 } 00:24:18.921 ] 00:24:18.921 }' 00:24:18.921 
18:28:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:18.921 18:28:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:19.487 18:28:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:19.744 [2024-07-12 18:28:03.307776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:19.744 [2024-07-12 18:28:03.307915] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:19.744 [2024-07-12 18:28:03.307942] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:19.744 [2024-07-12 18:28:03.307971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:19.744 [2024-07-12 18:28:03.313221] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2364fa0 00:24:19.744 [2024-07-12 18:28:03.315525] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:19.744 18:28:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:20.676 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:20.676 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:20.676 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:20.676 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:20.676 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:20.676 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:20.676 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:20.934 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:20.934 "name": "raid_bdev1", 00:24:20.934 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:20.934 "strip_size_kb": 0, 00:24:20.934 "state": "online", 00:24:20.934 "raid_level": "raid1", 00:24:20.934 "superblock": true, 00:24:20.934 "num_base_bdevs": 2, 00:24:20.934 "num_base_bdevs_discovered": 2, 00:24:20.934 "num_base_bdevs_operational": 2, 00:24:20.934 "process": { 00:24:20.934 "type": "rebuild", 00:24:20.934 "target": "spare", 00:24:20.934 "progress": { 00:24:20.934 "blocks": 24576, 00:24:20.934 "percent": 38 00:24:20.934 } 00:24:20.934 }, 00:24:20.934 "base_bdevs_list": [ 00:24:20.934 { 00:24:20.934 "name": "spare", 00:24:20.934 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:20.934 "is_configured": true, 00:24:20.934 "data_offset": 2048, 00:24:20.934 "data_size": 63488 00:24:20.934 }, 00:24:20.934 { 00:24:20.934 "name": "BaseBdev2", 00:24:20.934 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:20.934 "is_configured": true, 00:24:20.934 "data_offset": 2048, 00:24:20.934 "data_size": 63488 00:24:20.934 } 00:24:20.934 ] 00:24:20.934 }' 00:24:20.934 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:20.934 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:20.934 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:21.192 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:21.192 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:21.192 [2024-07-12 18:28:04.902913] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:21.450 [2024-07-12 18:28:04.928306] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:21.450 [2024-07-12 18:28:04.928351] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:21.450 [2024-07-12 18:28:04.928367] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:21.450 [2024-07-12 18:28:04.928375] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.450 18:28:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.708 18:28:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:21.708 "name": "raid_bdev1", 00:24:21.708 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:21.708 "strip_size_kb": 0, 00:24:21.708 "state": "online", 00:24:21.708 "raid_level": "raid1", 00:24:21.708 "superblock": true, 00:24:21.708 "num_base_bdevs": 2, 00:24:21.708 "num_base_bdevs_discovered": 1, 00:24:21.708 "num_base_bdevs_operational": 1, 00:24:21.708 "base_bdevs_list": [ 00:24:21.708 { 00:24:21.708 "name": null, 00:24:21.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:21.708 "is_configured": false, 00:24:21.708 "data_offset": 2048, 00:24:21.708 "data_size": 63488 00:24:21.708 }, 00:24:21.708 { 00:24:21.708 "name": "BaseBdev2", 00:24:21.708 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:21.708 "is_configured": true, 00:24:21.708 "data_offset": 2048, 00:24:21.708 "data_size": 63488 00:24:21.708 } 00:24:21.708 ] 00:24:21.708 }' 00:24:21.708 18:28:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:21.708 18:28:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:22.273 18:28:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:22.531 [2024-07-12 18:28:06.024257] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:22.531 [2024-07-12 18:28:06.024304] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:22.531 [2024-07-12 18:28:06.024329] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x266fa30 00:24:22.531 [2024-07-12 18:28:06.024342] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:22.531 [2024-07-12 18:28:06.024708] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:22.531 [2024-07-12 18:28:06.024727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:22.531 [2024-07-12 18:28:06.024806] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:22.531 [2024-07-12 18:28:06.024818] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:22.532 [2024-07-12 18:28:06.024829] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:22.532 [2024-07-12 18:28:06.024846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:22.532 [2024-07-12 18:28:06.030181] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28098f0 00:24:22.532 spare 00:24:22.532 [2024-07-12 18:28:06.031634] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:22.532 18:28:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:23.465 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:23.465 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:23.465 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:23.465 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:23.465 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:23.465 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.465 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.723 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:23.723 "name": "raid_bdev1", 00:24:23.723 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:23.723 "strip_size_kb": 0, 00:24:23.723 "state": "online", 00:24:23.723 "raid_level": "raid1", 00:24:23.723 "superblock": true, 00:24:23.723 "num_base_bdevs": 2, 00:24:23.723 "num_base_bdevs_discovered": 2, 00:24:23.723 "num_base_bdevs_operational": 2, 00:24:23.723 "process": { 00:24:23.723 "type": "rebuild", 00:24:23.723 "target": "spare", 00:24:23.723 "progress": { 00:24:23.723 "blocks": 24576, 00:24:23.723 "percent": 38 00:24:23.723 } 00:24:23.723 }, 00:24:23.723 "base_bdevs_list": [ 00:24:23.723 { 00:24:23.723 "name": "spare", 00:24:23.723 "uuid": "99b95efc-cc70-53f9-905e-a350d59bdb36", 00:24:23.723 "is_configured": true, 00:24:23.723 "data_offset": 2048, 00:24:23.723 "data_size": 63488 00:24:23.723 }, 00:24:23.723 { 00:24:23.723 "name": "BaseBdev2", 00:24:23.723 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:23.723 "is_configured": true, 00:24:23.723 "data_offset": 2048, 00:24:23.723 "data_size": 63488 00:24:23.723 } 00:24:23.723 ] 00:24:23.723 }' 00:24:23.723 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:23.723 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:23.723 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:23.723 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:23.723 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:23.981 [2024-07-12 18:28:07.623840] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:23.981 [2024-07-12 18:28:07.644296] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:23.981 [2024-07-12 18:28:07.644341] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:23.981 [2024-07-12 18:28:07.644356] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:23.981 [2024-07-12 18:28:07.644364] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.981 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.239 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:24.239 "name": "raid_bdev1", 00:24:24.239 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:24.239 "strip_size_kb": 0, 00:24:24.239 "state": "online", 00:24:24.239 "raid_level": "raid1", 00:24:24.239 "superblock": true, 00:24:24.239 "num_base_bdevs": 2, 00:24:24.239 "num_base_bdevs_discovered": 1, 00:24:24.239 "num_base_bdevs_operational": 1, 00:24:24.239 "base_bdevs_list": [ 00:24:24.239 { 00:24:24.239 "name": null, 00:24:24.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:24.239 "is_configured": false, 00:24:24.239 "data_offset": 2048, 00:24:24.239 "data_size": 63488 00:24:24.239 }, 00:24:24.239 { 00:24:24.239 "name": "BaseBdev2", 00:24:24.239 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:24.239 "is_configured": true, 00:24:24.239 "data_offset": 2048, 00:24:24.239 "data_size": 63488 00:24:24.239 } 00:24:24.239 ] 00:24:24.239 }' 00:24:24.239 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:24.239 18:28:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:24.805 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:24.805 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:24.805 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:24.805 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:24.805 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:24:24.805 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.805 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.063 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:25.063 "name": "raid_bdev1", 00:24:25.063 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:25.063 "strip_size_kb": 0, 00:24:25.063 "state": "online", 00:24:25.063 "raid_level": "raid1", 00:24:25.063 "superblock": true, 00:24:25.063 "num_base_bdevs": 2, 00:24:25.063 "num_base_bdevs_discovered": 1, 00:24:25.063 "num_base_bdevs_operational": 1, 00:24:25.063 "base_bdevs_list": [ 00:24:25.063 { 00:24:25.063 "name": null, 00:24:25.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.063 "is_configured": false, 00:24:25.063 "data_offset": 2048, 00:24:25.063 "data_size": 63488 00:24:25.063 }, 00:24:25.063 { 00:24:25.063 "name": "BaseBdev2", 00:24:25.063 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:25.063 "is_configured": true, 00:24:25.063 "data_offset": 2048, 00:24:25.063 "data_size": 63488 00:24:25.063 } 00:24:25.063 ] 00:24:25.063 }' 00:24:25.063 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:25.063 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:25.063 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.063 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:25.063 18:28:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:25.321 18:28:09 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:25.579 [2024-07-12 18:28:09.249815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:25.579 [2024-07-12 18:28:09.249865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:25.579 [2024-07-12 18:28:09.249888] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x280bf00 00:24:25.579 [2024-07-12 18:28:09.249901] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:25.579 [2024-07-12 18:28:09.250240] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:25.579 [2024-07-12 18:28:09.250260] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:25.579 [2024-07-12 18:28:09.250323] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:25.579 [2024-07-12 18:28:09.250335] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:25.579 [2024-07-12 18:28:09.250346] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:25.579 BaseBdev1 00:24:25.579 18:28:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.954 "name": "raid_bdev1", 00:24:26.954 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:26.954 "strip_size_kb": 0, 00:24:26.954 "state": "online", 00:24:26.954 "raid_level": "raid1", 00:24:26.954 "superblock": true, 00:24:26.954 "num_base_bdevs": 2, 00:24:26.954 "num_base_bdevs_discovered": 1, 00:24:26.954 "num_base_bdevs_operational": 1, 00:24:26.954 "base_bdevs_list": [ 00:24:26.954 { 00:24:26.954 "name": null, 00:24:26.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.954 "is_configured": false, 00:24:26.954 "data_offset": 2048, 00:24:26.954 "data_size": 63488 00:24:26.954 }, 00:24:26.954 { 00:24:26.954 "name": "BaseBdev2", 00:24:26.954 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:26.954 "is_configured": true, 00:24:26.954 "data_offset": 2048, 00:24:26.954 "data_size": 63488 00:24:26.954 } 00:24:26.954 ] 00:24:26.954 }' 00:24:26.954 18:28:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.954 18:28:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:27.549 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:27.549 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.549 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:27.549 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:27.549 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.549 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.549 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:27.807 "name": "raid_bdev1", 00:24:27.807 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:27.807 "strip_size_kb": 0, 00:24:27.807 "state": "online", 00:24:27.807 "raid_level": "raid1", 00:24:27.807 "superblock": true, 00:24:27.807 "num_base_bdevs": 2, 00:24:27.807 "num_base_bdevs_discovered": 1, 00:24:27.807 "num_base_bdevs_operational": 1, 00:24:27.807 "base_bdevs_list": [ 00:24:27.807 { 00:24:27.807 "name": null, 00:24:27.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.807 "is_configured": false, 00:24:27.807 "data_offset": 2048, 00:24:27.807 "data_size": 63488 00:24:27.807 }, 00:24:27.807 { 00:24:27.807 "name": "BaseBdev2", 00:24:27.807 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:27.807 "is_configured": true, 00:24:27.807 "data_offset": 2048, 00:24:27.807 
"data_size": 63488 00:24:27.807 } 00:24:27.807 ] 00:24:27.807 }' 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:27.807 18:28:11 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:27.807 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:28.065 [2024-07-12 18:28:11.680613] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:28.065 [2024-07-12 18:28:11.680731] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:28.065 [2024-07-12 18:28:11.680747] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:28.065 request: 00:24:28.065 { 00:24:28.065 "base_bdev": "BaseBdev1", 00:24:28.065 "raid_bdev": "raid_bdev1", 00:24:28.065 "method": "bdev_raid_add_base_bdev", 00:24:28.065 "req_id": 1 00:24:28.065 } 00:24:28.065 Got JSON-RPC error response 00:24:28.065 response: 00:24:28.065 { 00:24:28.065 "code": -22, 00:24:28.065 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:28.065 } 00:24:28.065 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:24:28.065 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:28.065 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:28.065 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:28.065 18:28:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.996 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.253 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:29.254 "name": "raid_bdev1", 00:24:29.254 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:29.254 "strip_size_kb": 0, 00:24:29.254 "state": "online", 00:24:29.254 "raid_level": "raid1", 00:24:29.254 "superblock": true, 00:24:29.254 "num_base_bdevs": 2, 00:24:29.254 "num_base_bdevs_discovered": 1, 00:24:29.254 "num_base_bdevs_operational": 1, 00:24:29.254 "base_bdevs_list": [ 00:24:29.254 { 00:24:29.254 "name": null, 00:24:29.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:29.254 
"is_configured": false, 00:24:29.254 "data_offset": 2048, 00:24:29.254 "data_size": 63488 00:24:29.254 }, 00:24:29.254 { 00:24:29.254 "name": "BaseBdev2", 00:24:29.254 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:29.254 "is_configured": true, 00:24:29.254 "data_offset": 2048, 00:24:29.254 "data_size": 63488 00:24:29.254 } 00:24:29.254 ] 00:24:29.254 }' 00:24:29.254 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:29.254 18:28:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:30.186 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:30.186 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:30.186 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:30.186 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:30.186 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:30.186 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.186 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.186 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:30.186 "name": "raid_bdev1", 00:24:30.186 "uuid": "737d8ccd-c49c-49ab-9f74-876a4b38b3ca", 00:24:30.186 "strip_size_kb": 0, 00:24:30.186 "state": "online", 00:24:30.186 "raid_level": "raid1", 00:24:30.186 "superblock": true, 00:24:30.186 "num_base_bdevs": 2, 00:24:30.186 "num_base_bdevs_discovered": 1, 00:24:30.186 "num_base_bdevs_operational": 1, 00:24:30.186 "base_bdevs_list": [ 00:24:30.186 { 00:24:30.186 "name": 
null, 00:24:30.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:30.186 "is_configured": false, 00:24:30.186 "data_offset": 2048, 00:24:30.186 "data_size": 63488 00:24:30.186 }, 00:24:30.186 { 00:24:30.187 "name": "BaseBdev2", 00:24:30.187 "uuid": "aa72c974-8220-599c-be82-4e8437946c19", 00:24:30.187 "is_configured": true, 00:24:30.187 "data_offset": 2048, 00:24:30.187 "data_size": 63488 00:24:30.187 } 00:24:30.187 ] 00:24:30.187 }' 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2576499 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2576499 ']' 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2576499 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:30.187 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2576499 00:24:30.444 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:30.444 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:30.444 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2576499' 00:24:30.444 killing process with pid 2576499 
00:24:30.444 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2576499 00:24:30.444 Received shutdown signal, test time was about 27.147044 seconds 00:24:30.444 00:24:30.444 Latency(us) 00:24:30.444 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:30.444 =================================================================================================================== 00:24:30.444 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:30.444 [2024-07-12 18:28:13.924384] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:30.444 [2024-07-12 18:28:13.924480] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:30.444 [2024-07-12 18:28:13.924522] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:30.444 [2024-07-12 18:28:13.924533] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x265cf70 name raid_bdev1, state offline 00:24:30.444 18:28:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2576499 00:24:30.444 [2024-07-12 18:28:13.946667] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:30.702 00:24:30.702 real 0m31.815s 00:24:30.702 user 0m49.596s 00:24:30.702 sys 0m4.637s 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:30.702 ************************************ 00:24:30.702 END TEST raid_rebuild_test_sb_io 00:24:30.702 ************************************ 00:24:30.702 18:28:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:30.702 18:28:14 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:24:30.702 18:28:14 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test 
raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:24:30.702 18:28:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:30.702 18:28:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:30.702 18:28:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:30.702 ************************************ 00:24:30.702 START TEST raid_rebuild_test 00:24:30.702 ************************************ 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:30.702 18:28:14 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:30.702 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2581072 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2581072 /var/tmp/spdk-raid.sock 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:30.703 18:28:14 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2581072 ']' 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:30.703 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:30.703 18:28:14 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:30.703 [2024-07-12 18:28:14.330715] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:24:30.703 [2024-07-12 18:28:14.330784] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2581072 ] 00:24:30.703 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:30.703 Zero copy mechanism will not be used. 
00:24:30.961 [2024-07-12 18:28:14.449775] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:30.961 [2024-07-12 18:28:14.555673] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:30.961 [2024-07-12 18:28:14.622861] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:30.961 [2024-07-12 18:28:14.622901] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:31.527 18:28:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:31.527 18:28:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:24:31.527 18:28:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:31.527 18:28:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:31.784 BaseBdev1_malloc 00:24:31.784 18:28:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:32.041 [2024-07-12 18:28:15.713019] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:32.041 [2024-07-12 18:28:15.713065] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:32.041 [2024-07-12 18:28:15.713088] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f2d40 00:24:32.041 [2024-07-12 18:28:15.713102] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:32.041 [2024-07-12 18:28:15.714816] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:32.041 [2024-07-12 18:28:15.714848] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:32.041 BaseBdev1 00:24:32.041 18:28:15 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:32.041 18:28:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:32.299 BaseBdev2_malloc 00:24:32.299 18:28:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:32.557 [2024-07-12 18:28:16.211247] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:32.557 [2024-07-12 18:28:16.211294] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:32.557 [2024-07-12 18:28:16.211317] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f3860 00:24:32.557 [2024-07-12 18:28:16.211330] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:32.557 [2024-07-12 18:28:16.212901] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:32.557 [2024-07-12 18:28:16.212938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:32.557 BaseBdev2 00:24:32.557 18:28:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:32.557 18:28:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:32.816 BaseBdev3_malloc 00:24:32.816 18:28:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:33.073 [2024-07-12 18:28:16.706100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on BaseBdev3_malloc 00:24:33.073 [2024-07-12 18:28:16.706147] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:33.073 [2024-07-12 18:28:16.706168] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28a08f0 00:24:33.073 [2024-07-12 18:28:16.706181] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:33.073 [2024-07-12 18:28:16.707756] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:33.073 [2024-07-12 18:28:16.707785] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:33.073 BaseBdev3 00:24:33.073 18:28:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:33.073 18:28:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:33.331 BaseBdev4_malloc 00:24:33.331 18:28:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:33.588 [2024-07-12 18:28:17.201246] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:33.588 [2024-07-12 18:28:17.201292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:33.588 [2024-07-12 18:28:17.201312] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x289fad0 00:24:33.588 [2024-07-12 18:28:17.201325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:33.588 [2024-07-12 18:28:17.202836] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:33.588 [2024-07-12 18:28:17.202867] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:33.588 BaseBdev4 
00:24:33.588 18:28:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:33.845 spare_malloc 00:24:33.846 18:28:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:34.103 spare_delay 00:24:34.103 18:28:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:34.360 [2024-07-12 18:28:17.911647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:34.361 [2024-07-12 18:28:17.911694] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:34.361 [2024-07-12 18:28:17.911715] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28a45b0 00:24:34.361 [2024-07-12 18:28:17.911728] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:34.361 [2024-07-12 18:28:17.913312] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:34.361 [2024-07-12 18:28:17.913341] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:34.361 spare 00:24:34.361 18:28:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:34.618 [2024-07-12 18:28:18.156314] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:34.618 [2024-07-12 18:28:18.157658] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:34.618 
[2024-07-12 18:28:18.157713] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:34.618 [2024-07-12 18:28:18.157759] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:34.618 [2024-07-12 18:28:18.157840] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28238a0 00:24:34.618 [2024-07-12 18:28:18.157851] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:34.618 [2024-07-12 18:28:18.158071] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x289de10 00:24:34.618 [2024-07-12 18:28:18.158221] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28238a0 00:24:34.618 [2024-07-12 18:28:18.158232] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28238a0 00:24:34.618 [2024-07-12 18:28:18.158347] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:34.618 18:28:18 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.618 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.875 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.875 "name": "raid_bdev1", 00:24:34.875 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:34.875 "strip_size_kb": 0, 00:24:34.875 "state": "online", 00:24:34.875 "raid_level": "raid1", 00:24:34.875 "superblock": false, 00:24:34.875 "num_base_bdevs": 4, 00:24:34.875 "num_base_bdevs_discovered": 4, 00:24:34.875 "num_base_bdevs_operational": 4, 00:24:34.875 "base_bdevs_list": [ 00:24:34.875 { 00:24:34.875 "name": "BaseBdev1", 00:24:34.875 "uuid": "04ddfa43-b462-56ed-905f-e80c727e1b9a", 00:24:34.875 "is_configured": true, 00:24:34.875 "data_offset": 0, 00:24:34.875 "data_size": 65536 00:24:34.875 }, 00:24:34.875 { 00:24:34.875 "name": "BaseBdev2", 00:24:34.875 "uuid": "5e462edf-13b7-5acd-b9b3-73280c58b2df", 00:24:34.875 "is_configured": true, 00:24:34.875 "data_offset": 0, 00:24:34.875 "data_size": 65536 00:24:34.875 }, 00:24:34.875 { 00:24:34.875 "name": "BaseBdev3", 00:24:34.875 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:34.875 "is_configured": true, 00:24:34.875 "data_offset": 0, 00:24:34.875 "data_size": 65536 00:24:34.875 }, 00:24:34.875 { 00:24:34.875 "name": "BaseBdev4", 00:24:34.875 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:34.875 "is_configured": true, 00:24:34.875 "data_offset": 0, 00:24:34.875 "data_size": 65536 00:24:34.875 } 00:24:34.876 ] 00:24:34.876 }' 00:24:34.876 18:28:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:34.876 18:28:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 
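The setup traced above (four malloc base bdevs each claimed behind a passthru bdev, a delayed spare, and a raid1 bdev assembled from them) can be condensed into the following shell sketch. This is a sketch under assumptions, not the test script itself: it assumes a running SPDK target serving RPCs on `/var/tmp/spdk-raid.sock` and an SPDK checkout as the working directory; command names and arguments are taken verbatim from the trace.

```shell
# Hypothetical condensation of the RPC sequence recorded in this log.
RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Four 32 MiB malloc bdevs with 512 B blocks, each wrapped in a passthru bdev
for i in 1 2 3 4; do
    $RPC bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
    $RPC bdev_passthru_create -b "BaseBdev${i}_malloc" -p "BaseBdev${i}"
done

# Spare device chain: malloc -> delay (zero added latency here) -> passthru named "spare"
$RPC bdev_malloc_create 32 512 -b spare_malloc
$RPC bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$RPC bdev_passthru_create -b spare_delay -p spare

# Assemble a 4-disk raid1 bdev from the passthru devices
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1
```

Later in the trace the raid bdev is exported over NBD (`nbd_start_disk`), filled with `dd`, and base bdevs are removed with `bdev_raid_remove_base_bdev` to drive the rebuild onto the spare.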
00:24:35.441 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:35.441 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:35.699 [2024-07-12 18:28:19.259507] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:35.699 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:35.699 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.699 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:35.956 
18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:35.956 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:36.214 [2024-07-12 18:28:19.760576] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x289de10 00:24:36.214 /dev/nbd0 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:36.214 1+0 records in 00:24:36.214 1+0 records out 00:24:36.214 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229677 s, 17.8 MB/s 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:36.214 18:28:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:44.318 65536+0 records in 00:24:44.318 65536+0 records out 00:24:44.318 33554432 bytes (34 MB, 32 MiB) copied, 7.22811 s, 4.6 MB/s 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:44.318 [2024-07-12 18:28:27.318798] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:44.318 [2024-07-12 18:28:27.555472] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.318 "name": "raid_bdev1", 00:24:44.318 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:44.318 "strip_size_kb": 0, 00:24:44.318 "state": "online", 00:24:44.318 "raid_level": "raid1", 00:24:44.318 "superblock": false, 00:24:44.318 "num_base_bdevs": 4, 00:24:44.318 "num_base_bdevs_discovered": 3, 00:24:44.318 "num_base_bdevs_operational": 3, 00:24:44.318 "base_bdevs_list": [ 00:24:44.318 { 00:24:44.318 "name": null, 00:24:44.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.318 "is_configured": false, 00:24:44.318 "data_offset": 0, 00:24:44.318 "data_size": 65536 00:24:44.318 }, 00:24:44.318 { 00:24:44.318 "name": "BaseBdev2", 00:24:44.318 "uuid": "5e462edf-13b7-5acd-b9b3-73280c58b2df", 00:24:44.318 "is_configured": true, 00:24:44.318 "data_offset": 0, 00:24:44.318 "data_size": 65536 00:24:44.318 }, 00:24:44.318 { 00:24:44.318 "name": "BaseBdev3", 00:24:44.318 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:44.318 "is_configured": true, 00:24:44.318 "data_offset": 0, 00:24:44.318 "data_size": 65536 00:24:44.318 }, 00:24:44.318 { 00:24:44.318 "name": "BaseBdev4", 00:24:44.318 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:44.318 "is_configured": true, 00:24:44.318 "data_offset": 0, 00:24:44.318 
"data_size": 65536 00:24:44.318 } 00:24:44.318 ] 00:24:44.318 }' 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.318 18:28:27 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:44.884 18:28:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:45.142 [2024-07-12 18:28:28.650393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:45.142 [2024-07-12 18:28:28.654498] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28296b0 00:24:45.142 [2024-07-12 18:28:28.656851] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:45.142 18:28:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:46.077 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:46.077 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.077 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:46.077 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:46.077 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.077 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.077 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.335 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:46.335 "name": "raid_bdev1", 00:24:46.335 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 
00:24:46.335 "strip_size_kb": 0, 00:24:46.335 "state": "online", 00:24:46.335 "raid_level": "raid1", 00:24:46.335 "superblock": false, 00:24:46.335 "num_base_bdevs": 4, 00:24:46.335 "num_base_bdevs_discovered": 4, 00:24:46.335 "num_base_bdevs_operational": 4, 00:24:46.335 "process": { 00:24:46.335 "type": "rebuild", 00:24:46.335 "target": "spare", 00:24:46.335 "progress": { 00:24:46.335 "blocks": 24576, 00:24:46.335 "percent": 37 00:24:46.335 } 00:24:46.335 }, 00:24:46.335 "base_bdevs_list": [ 00:24:46.335 { 00:24:46.335 "name": "spare", 00:24:46.335 "uuid": "f4018b23-a994-58e7-8d00-0e40f1135d85", 00:24:46.335 "is_configured": true, 00:24:46.335 "data_offset": 0, 00:24:46.335 "data_size": 65536 00:24:46.335 }, 00:24:46.335 { 00:24:46.335 "name": "BaseBdev2", 00:24:46.335 "uuid": "5e462edf-13b7-5acd-b9b3-73280c58b2df", 00:24:46.335 "is_configured": true, 00:24:46.335 "data_offset": 0, 00:24:46.335 "data_size": 65536 00:24:46.335 }, 00:24:46.335 { 00:24:46.335 "name": "BaseBdev3", 00:24:46.335 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:46.335 "is_configured": true, 00:24:46.335 "data_offset": 0, 00:24:46.335 "data_size": 65536 00:24:46.335 }, 00:24:46.335 { 00:24:46.335 "name": "BaseBdev4", 00:24:46.336 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:46.336 "is_configured": true, 00:24:46.336 "data_offset": 0, 00:24:46.336 "data_size": 65536 00:24:46.336 } 00:24:46.336 ] 00:24:46.336 }' 00:24:46.336 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:46.336 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:46.336 18:28:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:46.336 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:46.336 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:46.593 [2024-07-12 18:28:30.244512] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:46.593 [2024-07-12 18:28:30.269449] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:46.593 [2024-07-12 18:28:30.269493] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:46.593 [2024-07-12 18:28:30.269510] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:46.593 [2024-07-12 18:28:30.269519] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:24:46.594 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.851 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:46.851 "name": "raid_bdev1", 00:24:46.851 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:46.851 "strip_size_kb": 0, 00:24:46.851 "state": "online", 00:24:46.851 "raid_level": "raid1", 00:24:46.851 "superblock": false, 00:24:46.851 "num_base_bdevs": 4, 00:24:46.851 "num_base_bdevs_discovered": 3, 00:24:46.851 "num_base_bdevs_operational": 3, 00:24:46.851 "base_bdevs_list": [ 00:24:46.851 { 00:24:46.851 "name": null, 00:24:46.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.851 "is_configured": false, 00:24:46.851 "data_offset": 0, 00:24:46.851 "data_size": 65536 00:24:46.851 }, 00:24:46.851 { 00:24:46.851 "name": "BaseBdev2", 00:24:46.851 "uuid": "5e462edf-13b7-5acd-b9b3-73280c58b2df", 00:24:46.851 "is_configured": true, 00:24:46.851 "data_offset": 0, 00:24:46.851 "data_size": 65536 00:24:46.851 }, 00:24:46.851 { 00:24:46.851 "name": "BaseBdev3", 00:24:46.851 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:46.851 "is_configured": true, 00:24:46.851 "data_offset": 0, 00:24:46.851 "data_size": 65536 00:24:46.851 }, 00:24:46.851 { 00:24:46.851 "name": "BaseBdev4", 00:24:46.851 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:46.851 "is_configured": true, 00:24:46.851 "data_offset": 0, 00:24:46.851 "data_size": 65536 00:24:46.851 } 00:24:46.851 ] 00:24:46.851 }' 00:24:46.851 18:28:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:46.851 18:28:30 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:47.783 
18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:47.783 "name": "raid_bdev1", 00:24:47.783 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:47.783 "strip_size_kb": 0, 00:24:47.783 "state": "online", 00:24:47.783 "raid_level": "raid1", 00:24:47.783 "superblock": false, 00:24:47.783 "num_base_bdevs": 4, 00:24:47.783 "num_base_bdevs_discovered": 3, 00:24:47.783 "num_base_bdevs_operational": 3, 00:24:47.783 "base_bdevs_list": [ 00:24:47.783 { 00:24:47.783 "name": null, 00:24:47.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.783 "is_configured": false, 00:24:47.783 "data_offset": 0, 00:24:47.783 "data_size": 65536 00:24:47.783 }, 00:24:47.783 { 00:24:47.783 "name": "BaseBdev2", 00:24:47.783 "uuid": "5e462edf-13b7-5acd-b9b3-73280c58b2df", 00:24:47.783 "is_configured": true, 00:24:47.783 "data_offset": 0, 00:24:47.783 "data_size": 65536 00:24:47.783 }, 00:24:47.783 { 00:24:47.783 "name": "BaseBdev3", 00:24:47.783 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:47.783 "is_configured": true, 00:24:47.783 "data_offset": 0, 00:24:47.783 "data_size": 65536 00:24:47.783 }, 00:24:47.783 { 00:24:47.783 "name": "BaseBdev4", 00:24:47.783 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:47.783 "is_configured": true, 00:24:47.783 "data_offset": 0, 00:24:47.783 "data_size": 65536 00:24:47.783 } 
00:24:47.783 ] 00:24:47.783 }' 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:47.783 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:48.041 [2024-07-12 18:28:31.705338] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:48.041 [2024-07-12 18:28:31.709603] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28296b0 00:24:48.041 [2024-07-12 18:28:31.711115] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:48.041 18:28:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:49.412 18:28:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:49.412 18:28:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.412 18:28:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:49.412 18:28:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:49.412 18:28:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.412 18:28:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.412 18:28:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:24:49.412 18:28:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:49.412 "name": "raid_bdev1", 00:24:49.412 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:49.412 "strip_size_kb": 0, 00:24:49.412 "state": "online", 00:24:49.412 "raid_level": "raid1", 00:24:49.412 "superblock": false, 00:24:49.412 "num_base_bdevs": 4, 00:24:49.412 "num_base_bdevs_discovered": 4, 00:24:49.412 "num_base_bdevs_operational": 4, 00:24:49.412 "process": { 00:24:49.412 "type": "rebuild", 00:24:49.412 "target": "spare", 00:24:49.412 "progress": { 00:24:49.412 "blocks": 24576, 00:24:49.412 "percent": 37 00:24:49.412 } 00:24:49.412 }, 00:24:49.412 "base_bdevs_list": [ 00:24:49.412 { 00:24:49.412 "name": "spare", 00:24:49.412 "uuid": "f4018b23-a994-58e7-8d00-0e40f1135d85", 00:24:49.412 "is_configured": true, 00:24:49.412 "data_offset": 0, 00:24:49.412 "data_size": 65536 00:24:49.412 }, 00:24:49.412 { 00:24:49.412 "name": "BaseBdev2", 00:24:49.412 "uuid": "5e462edf-13b7-5acd-b9b3-73280c58b2df", 00:24:49.412 "is_configured": true, 00:24:49.412 "data_offset": 0, 00:24:49.412 "data_size": 65536 00:24:49.412 }, 00:24:49.412 { 00:24:49.412 "name": "BaseBdev3", 00:24:49.412 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:49.412 "is_configured": true, 00:24:49.412 "data_offset": 0, 00:24:49.412 "data_size": 65536 00:24:49.412 }, 00:24:49.412 { 00:24:49.412 "name": "BaseBdev4", 00:24:49.412 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:49.412 "is_configured": true, 00:24:49.412 "data_offset": 0, 00:24:49.412 "data_size": 65536 00:24:49.412 } 00:24:49.412 ] 00:24:49.412 }' 00:24:49.412 18:28:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:49.412 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:49.412 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:49.412 18:28:33 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:49.412 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:49.412 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:49.412 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:49.412 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:49.412 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:49.669 [2024-07-12 18:28:33.295495] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:49.669 [2024-07-12 18:28:33.323540] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x28296b0 00:24:49.669 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:49.669 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:24:49.669 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:49.669 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.669 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:49.669 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:49.669 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.669 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.669 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:24:49.926 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:49.926 "name": "raid_bdev1", 00:24:49.926 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:49.926 "strip_size_kb": 0, 00:24:49.926 "state": "online", 00:24:49.926 "raid_level": "raid1", 00:24:49.926 "superblock": false, 00:24:49.926 "num_base_bdevs": 4, 00:24:49.926 "num_base_bdevs_discovered": 3, 00:24:49.926 "num_base_bdevs_operational": 3, 00:24:49.926 "process": { 00:24:49.926 "type": "rebuild", 00:24:49.926 "target": "spare", 00:24:49.926 "progress": { 00:24:49.926 "blocks": 36864, 00:24:49.926 "percent": 56 00:24:49.926 } 00:24:49.926 }, 00:24:49.926 "base_bdevs_list": [ 00:24:49.926 { 00:24:49.926 "name": "spare", 00:24:49.926 "uuid": "f4018b23-a994-58e7-8d00-0e40f1135d85", 00:24:49.926 "is_configured": true, 00:24:49.926 "data_offset": 0, 00:24:49.926 "data_size": 65536 00:24:49.926 }, 00:24:49.926 { 00:24:49.926 "name": null, 00:24:49.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.926 "is_configured": false, 00:24:49.926 "data_offset": 0, 00:24:49.926 "data_size": 65536 00:24:49.926 }, 00:24:49.926 { 00:24:49.926 "name": "BaseBdev3", 00:24:49.926 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:49.926 "is_configured": true, 00:24:49.926 "data_offset": 0, 00:24:49.926 "data_size": 65536 00:24:49.926 }, 00:24:49.926 { 00:24:49.926 "name": "BaseBdev4", 00:24:49.926 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:49.926 "is_configured": true, 00:24:49.926 "data_offset": 0, 00:24:49.926 "data_size": 65536 00:24:49.926 } 00:24:49.926 ] 00:24:49.926 }' 00:24:49.926 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:49.926 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:49.926 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:50.183 
18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:50.183 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=886 00:24:50.183 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:50.183 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:50.183 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:50.183 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:50.183 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:50.183 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:50.183 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.183 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.440 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:50.440 "name": "raid_bdev1", 00:24:50.440 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:50.440 "strip_size_kb": 0, 00:24:50.440 "state": "online", 00:24:50.440 "raid_level": "raid1", 00:24:50.440 "superblock": false, 00:24:50.440 "num_base_bdevs": 4, 00:24:50.440 "num_base_bdevs_discovered": 3, 00:24:50.440 "num_base_bdevs_operational": 3, 00:24:50.440 "process": { 00:24:50.440 "type": "rebuild", 00:24:50.440 "target": "spare", 00:24:50.440 "progress": { 00:24:50.440 "blocks": 43008, 00:24:50.440 "percent": 65 00:24:50.440 } 00:24:50.440 }, 00:24:50.440 "base_bdevs_list": [ 00:24:50.440 { 00:24:50.440 "name": "spare", 00:24:50.440 "uuid": "f4018b23-a994-58e7-8d00-0e40f1135d85", 00:24:50.440 "is_configured": true, 
00:24:50.440 "data_offset": 0, 00:24:50.440 "data_size": 65536 00:24:50.440 }, 00:24:50.440 { 00:24:50.440 "name": null, 00:24:50.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:50.440 "is_configured": false, 00:24:50.440 "data_offset": 0, 00:24:50.440 "data_size": 65536 00:24:50.440 }, 00:24:50.440 { 00:24:50.440 "name": "BaseBdev3", 00:24:50.440 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:50.440 "is_configured": true, 00:24:50.440 "data_offset": 0, 00:24:50.440 "data_size": 65536 00:24:50.440 }, 00:24:50.440 { 00:24:50.440 "name": "BaseBdev4", 00:24:50.440 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:50.440 "is_configured": true, 00:24:50.440 "data_offset": 0, 00:24:50.440 "data_size": 65536 00:24:50.440 } 00:24:50.440 ] 00:24:50.440 }' 00:24:50.440 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:50.440 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:50.440 18:28:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:50.440 18:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:50.440 18:28:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:51.374 [2024-07-12 18:28:34.936060] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:51.374 [2024-07-12 18:28:34.936120] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:51.374 [2024-07-12 18:28:34.936162] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:51.374 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:51.374 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:51.374 18:28:35 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:51.374 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:51.374 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:51.374 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:51.374 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.374 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.631 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.631 "name": "raid_bdev1", 00:24:51.631 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:51.631 "strip_size_kb": 0, 00:24:51.631 "state": "online", 00:24:51.631 "raid_level": "raid1", 00:24:51.631 "superblock": false, 00:24:51.631 "num_base_bdevs": 4, 00:24:51.631 "num_base_bdevs_discovered": 3, 00:24:51.631 "num_base_bdevs_operational": 3, 00:24:51.631 "base_bdevs_list": [ 00:24:51.631 { 00:24:51.631 "name": "spare", 00:24:51.631 "uuid": "f4018b23-a994-58e7-8d00-0e40f1135d85", 00:24:51.631 "is_configured": true, 00:24:51.631 "data_offset": 0, 00:24:51.631 "data_size": 65536 00:24:51.631 }, 00:24:51.631 { 00:24:51.631 "name": null, 00:24:51.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.631 "is_configured": false, 00:24:51.631 "data_offset": 0, 00:24:51.631 "data_size": 65536 00:24:51.631 }, 00:24:51.631 { 00:24:51.631 "name": "BaseBdev3", 00:24:51.631 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:51.631 "is_configured": true, 00:24:51.631 "data_offset": 0, 00:24:51.631 "data_size": 65536 00:24:51.631 }, 00:24:51.631 { 00:24:51.632 "name": "BaseBdev4", 00:24:51.632 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:51.632 "is_configured": true, 00:24:51.632 
"data_offset": 0, 00:24:51.632 "data_size": 65536 00:24:51.632 } 00:24:51.632 ] 00:24:51.632 }' 00:24:51.632 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.632 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:51.632 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.890 "name": "raid_bdev1", 00:24:51.890 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:51.890 "strip_size_kb": 0, 00:24:51.890 "state": "online", 00:24:51.890 "raid_level": "raid1", 00:24:51.890 "superblock": false, 00:24:51.890 "num_base_bdevs": 4, 00:24:51.890 "num_base_bdevs_discovered": 3, 00:24:51.890 "num_base_bdevs_operational": 3, 00:24:51.890 "base_bdevs_list": [ 00:24:51.890 { 00:24:51.890 
"name": "spare", 00:24:51.890 "uuid": "f4018b23-a994-58e7-8d00-0e40f1135d85", 00:24:51.890 "is_configured": true, 00:24:51.890 "data_offset": 0, 00:24:51.890 "data_size": 65536 00:24:51.890 }, 00:24:51.890 { 00:24:51.890 "name": null, 00:24:51.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.890 "is_configured": false, 00:24:51.890 "data_offset": 0, 00:24:51.890 "data_size": 65536 00:24:51.890 }, 00:24:51.890 { 00:24:51.890 "name": "BaseBdev3", 00:24:51.890 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:51.890 "is_configured": true, 00:24:51.890 "data_offset": 0, 00:24:51.890 "data_size": 65536 00:24:51.890 }, 00:24:51.890 { 00:24:51.890 "name": "BaseBdev4", 00:24:51.890 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:51.890 "is_configured": true, 00:24:51.890 "data_offset": 0, 00:24:51.890 "data_size": 65536 00:24:51.890 } 00:24:51.890 ] 00:24:51.890 }' 00:24:51.890 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.236 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:52.494 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:52.494 "name": "raid_bdev1", 00:24:52.494 "uuid": "d8fc280a-8c4c-490c-9f4f-b1fcc2885eeb", 00:24:52.494 "strip_size_kb": 0, 00:24:52.494 "state": "online", 00:24:52.494 "raid_level": "raid1", 00:24:52.494 "superblock": false, 00:24:52.494 "num_base_bdevs": 4, 00:24:52.494 "num_base_bdevs_discovered": 3, 00:24:52.494 "num_base_bdevs_operational": 3, 00:24:52.494 "base_bdevs_list": [ 00:24:52.494 { 00:24:52.494 "name": "spare", 00:24:52.494 "uuid": "f4018b23-a994-58e7-8d00-0e40f1135d85", 00:24:52.494 "is_configured": true, 00:24:52.494 "data_offset": 0, 00:24:52.494 "data_size": 65536 00:24:52.494 }, 00:24:52.494 { 00:24:52.494 "name": null, 00:24:52.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:52.494 "is_configured": false, 00:24:52.494 "data_offset": 0, 00:24:52.494 "data_size": 65536 00:24:52.494 }, 00:24:52.494 { 00:24:52.494 "name": "BaseBdev3", 00:24:52.494 "uuid": "dfe76da4-afbb-5d30-aa47-36bfefdbed12", 00:24:52.494 "is_configured": true, 00:24:52.494 "data_offset": 0, 00:24:52.494 "data_size": 65536 00:24:52.494 }, 00:24:52.494 { 00:24:52.494 "name": "BaseBdev4", 00:24:52.494 "uuid": "f70f6b4f-7527-5ab6-961d-da122d62e8a7", 00:24:52.494 
"is_configured": true, 00:24:52.494 "data_offset": 0, 00:24:52.495 "data_size": 65536 00:24:52.495 } 00:24:52.495 ] 00:24:52.495 }' 00:24:52.495 18:28:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:52.495 18:28:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:53.060 18:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:53.060 [2024-07-12 18:28:36.785040] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:53.060 [2024-07-12 18:28:36.785068] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:53.060 [2024-07-12 18:28:36.785123] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:53.060 [2024-07-12 18:28:36.785188] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:53.060 [2024-07-12 18:28:36.785200] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28238a0 name raid_bdev1, state offline 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 
-- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:53.318 18:28:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:53.319 18:28:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:53.577 /dev/nbd0 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:53.577 
18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:53.577 1+0 records in 00:24:53.577 1+0 records out 00:24:53.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247845 s, 16.5 MB/s 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:53.577 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:53.835 /dev/nbd1 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:53.835 18:28:37 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:53.835 1+0 records in 00:24:53.835 1+0 records out 00:24:53.835 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000354652 s, 11.5 MB/s 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:53.835 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:53.836 18:28:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:54.094 18:28:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:54.094 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:54.094 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:54.094 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:54.094 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:54.094 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:54.094 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:54.353 18:28:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:54.611 18:28:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:54.611 18:28:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:54.611 18:28:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:54.611 18:28:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:54.611 
18:28:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:54.611 18:28:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:54.611 18:28:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:54.611 18:28:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:54.611 18:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:54.611 18:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2581072 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2581072 ']' 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2581072 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2581072 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2581072' 00:24:54.612 killing process with pid 2581072 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2581072 00:24:54.612 Received shutdown signal, test time was about 60.000000 seconds 00:24:54.612 00:24:54.612 Latency(us) 00:24:54.612 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:54.612 =================================================================================================================== 00:24:54.612 Total : 0.00 0.00 0.00 0.00 0.00 
18446744073709551616.00 0.00 00:24:54.612 [2024-07-12 18:28:38.232435] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:54.612 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2581072 00:24:54.612 [2024-07-12 18:28:38.282595] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:54.871 18:28:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:24:54.871 00:24:54.871 real 0m24.243s 00:24:54.871 user 0m32.856s 00:24:54.871 sys 0m5.188s 00:24:54.871 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:54.871 18:28:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:54.871 ************************************ 00:24:54.871 END TEST raid_rebuild_test 00:24:54.871 ************************************ 00:24:54.871 18:28:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:54.871 18:28:38 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:24:54.871 18:28:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:54.871 18:28:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:54.871 18:28:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:55.130 ************************************ 00:24:55.130 START TEST raid_rebuild_test_sb 00:24:55.130 ************************************ 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 
-- # local background_io=false 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 
00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2584387 00:24:55.130 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2584387 /var/tmp/spdk-raid.sock 00:24:55.131 18:28:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:55.131 18:28:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2584387 ']' 00:24:55.131 18:28:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:55.131 18:28:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:55.131 18:28:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:55.131 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:24:55.131 18:28:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:55.131 18:28:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:55.131 [2024-07-12 18:28:38.673646] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:24:55.131 [2024-07-12 18:28:38.673718] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2584387 ] 00:24:55.131 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:55.131 Zero copy mechanism will not be used. 00:24:55.131 [2024-07-12 18:28:38.805858] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:55.389 [2024-07-12 18:28:38.914125] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:55.389 [2024-07-12 18:28:38.972399] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:55.389 [2024-07-12 18:28:38.972428] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:55.954 18:28:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:55.954 18:28:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:24:55.954 18:28:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:55.954 18:28:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:56.211 BaseBdev1_malloc 00:24:56.211 18:28:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:56.211 
[2024-07-12 18:28:39.850120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:56.211 [2024-07-12 18:28:39.850171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:56.211 [2024-07-12 18:28:39.850193] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df5d40 00:24:56.211 [2024-07-12 18:28:39.850206] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:56.211 [2024-07-12 18:28:39.851764] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:56.211 [2024-07-12 18:28:39.851792] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:56.211 BaseBdev1 00:24:56.211 18:28:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:56.211 18:28:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:56.468 BaseBdev2_malloc 00:24:56.468 18:28:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:56.726 [2024-07-12 18:28:40.220097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:56.726 [2024-07-12 18:28:40.220147] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:56.726 [2024-07-12 18:28:40.220172] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1df6860 00:24:56.726 [2024-07-12 18:28:40.220185] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:56.726 [2024-07-12 18:28:40.221694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:56.726 [2024-07-12 18:28:40.221723] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:56.726 BaseBdev2 00:24:56.726 18:28:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:56.726 18:28:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:56.726 BaseBdev3_malloc 00:24:56.726 18:28:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:56.984 [2024-07-12 18:28:40.585787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:56.984 [2024-07-12 18:28:40.585834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:56.984 [2024-07-12 18:28:40.585854] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa38f0 00:24:56.984 [2024-07-12 18:28:40.585866] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:56.984 [2024-07-12 18:28:40.587258] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:56.984 [2024-07-12 18:28:40.587289] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:56.984 BaseBdev3 00:24:56.984 18:28:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:56.984 18:28:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:57.242 BaseBdev4_malloc 00:24:57.242 18:28:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
BaseBdev4_malloc -p BaseBdev4 00:24:57.242 [2024-07-12 18:28:40.947231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:57.242 [2024-07-12 18:28:40.947276] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:57.242 [2024-07-12 18:28:40.947296] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa2ad0 00:24:57.242 [2024-07-12 18:28:40.947308] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:57.242 [2024-07-12 18:28:40.948659] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:57.242 [2024-07-12 18:28:40.948692] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:57.242 BaseBdev4 00:24:57.242 18:28:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:57.500 spare_malloc 00:24:57.500 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:57.757 spare_delay 00:24:57.757 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:57.757 [2024-07-12 18:28:41.457042] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:57.757 [2024-07-12 18:28:41.457091] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:57.757 [2024-07-12 18:28:41.457110] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa75b0 00:24:57.757 [2024-07-12 18:28:41.457123] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:24:57.757 [2024-07-12 18:28:41.458514] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:57.757 [2024-07-12 18:28:41.458541] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:57.757 spare 00:24:57.757 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:58.015 [2024-07-12 18:28:41.641564] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:58.015 [2024-07-12 18:28:41.642717] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:58.015 [2024-07-12 18:28:41.642770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:58.015 [2024-07-12 18:28:41.642815] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:58.015 [2024-07-12 18:28:41.643006] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f268a0 00:24:58.015 [2024-07-12 18:28:41.643018] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:58.015 [2024-07-12 18:28:41.643197] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa0e10 00:24:58.015 [2024-07-12 18:28:41.643340] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f268a0 00:24:58.015 [2024-07-12 18:28:41.643350] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f268a0 00:24:58.015 [2024-07-12 18:28:41.643437] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.015 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.274 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:58.274 "name": "raid_bdev1", 00:24:58.274 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:24:58.274 "strip_size_kb": 0, 00:24:58.274 "state": "online", 00:24:58.274 "raid_level": "raid1", 00:24:58.274 "superblock": true, 00:24:58.274 "num_base_bdevs": 4, 00:24:58.274 "num_base_bdevs_discovered": 4, 00:24:58.274 "num_base_bdevs_operational": 4, 00:24:58.274 "base_bdevs_list": [ 00:24:58.274 { 00:24:58.274 "name": "BaseBdev1", 00:24:58.274 "uuid": "8fb36253-bf9b-5320-bc0b-fcb6b6e4e008", 00:24:58.274 "is_configured": true, 00:24:58.274 "data_offset": 2048, 00:24:58.274 "data_size": 63488 00:24:58.274 }, 00:24:58.274 { 00:24:58.274 "name": "BaseBdev2", 00:24:58.274 
"uuid": "76e518af-1c43-5cba-9625-6b887213e732", 00:24:58.274 "is_configured": true, 00:24:58.274 "data_offset": 2048, 00:24:58.274 "data_size": 63488 00:24:58.274 }, 00:24:58.274 { 00:24:58.274 "name": "BaseBdev3", 00:24:58.274 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:24:58.274 "is_configured": true, 00:24:58.274 "data_offset": 2048, 00:24:58.274 "data_size": 63488 00:24:58.274 }, 00:24:58.274 { 00:24:58.274 "name": "BaseBdev4", 00:24:58.274 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:24:58.274 "is_configured": true, 00:24:58.274 "data_offset": 2048, 00:24:58.274 "data_size": 63488 00:24:58.274 } 00:24:58.274 ] 00:24:58.274 }' 00:24:58.274 18:28:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:58.274 18:28:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:58.839 18:28:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:58.839 18:28:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:59.098 [2024-07-12 18:28:42.748802] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:59.098 18:28:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:59.098 18:28:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.098 18:28:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true 
= true ']' 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:59.355 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:59.613 [2024-07-12 18:28:43.241851] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa0e10 00:24:59.613 /dev/nbd0 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:59.613 1+0 records in 00:24:59.613 1+0 records out 00:24:59.613 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244471 s, 16.8 MB/s 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:59.613 18:28:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 
00:25:07.712 63488+0 records in 00:25:07.712 63488+0 records out 00:25:07.712 32505856 bytes (33 MB, 31 MiB) copied, 7.23321 s, 4.5 MB/s 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:07.712 [2024-07-12 18:28:50.810503] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:07.712 18:28:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:07.712 [2024-07-12 18:28:51.046397] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.712 "name": "raid_bdev1", 00:25:07.712 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:07.712 "strip_size_kb": 0, 00:25:07.712 "state": "online", 00:25:07.712 "raid_level": "raid1", 00:25:07.712 "superblock": true, 
00:25:07.712 "num_base_bdevs": 4, 00:25:07.712 "num_base_bdevs_discovered": 3, 00:25:07.712 "num_base_bdevs_operational": 3, 00:25:07.712 "base_bdevs_list": [ 00:25:07.712 { 00:25:07.712 "name": null, 00:25:07.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.712 "is_configured": false, 00:25:07.712 "data_offset": 2048, 00:25:07.712 "data_size": 63488 00:25:07.712 }, 00:25:07.712 { 00:25:07.712 "name": "BaseBdev2", 00:25:07.712 "uuid": "76e518af-1c43-5cba-9625-6b887213e732", 00:25:07.712 "is_configured": true, 00:25:07.712 "data_offset": 2048, 00:25:07.712 "data_size": 63488 00:25:07.712 }, 00:25:07.712 { 00:25:07.712 "name": "BaseBdev3", 00:25:07.712 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:07.712 "is_configured": true, 00:25:07.712 "data_offset": 2048, 00:25:07.712 "data_size": 63488 00:25:07.712 }, 00:25:07.712 { 00:25:07.712 "name": "BaseBdev4", 00:25:07.712 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:07.712 "is_configured": true, 00:25:07.712 "data_offset": 2048, 00:25:07.712 "data_size": 63488 00:25:07.712 } 00:25:07.712 ] 00:25:07.712 }' 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.712 18:28:51 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:08.276 18:28:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:08.533 [2024-07-12 18:28:52.125270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.533 [2024-07-12 18:28:52.129384] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa0e10 00:25:08.533 [2024-07-12 18:28:52.131750] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:08.533 18:28:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:09.464 18:28:53 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.464 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.464 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.464 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.464 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.464 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.464 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.721 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.721 "name": "raid_bdev1", 00:25:09.721 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:09.721 "strip_size_kb": 0, 00:25:09.721 "state": "online", 00:25:09.721 "raid_level": "raid1", 00:25:09.721 "superblock": true, 00:25:09.721 "num_base_bdevs": 4, 00:25:09.721 "num_base_bdevs_discovered": 4, 00:25:09.721 "num_base_bdevs_operational": 4, 00:25:09.721 "process": { 00:25:09.721 "type": "rebuild", 00:25:09.721 "target": "spare", 00:25:09.721 "progress": { 00:25:09.721 "blocks": 24576, 00:25:09.721 "percent": 38 00:25:09.721 } 00:25:09.721 }, 00:25:09.721 "base_bdevs_list": [ 00:25:09.721 { 00:25:09.721 "name": "spare", 00:25:09.721 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:09.721 "is_configured": true, 00:25:09.721 "data_offset": 2048, 00:25:09.721 "data_size": 63488 00:25:09.721 }, 00:25:09.721 { 00:25:09.721 "name": "BaseBdev2", 00:25:09.721 "uuid": "76e518af-1c43-5cba-9625-6b887213e732", 00:25:09.721 "is_configured": true, 00:25:09.721 "data_offset": 2048, 00:25:09.721 "data_size": 63488 
00:25:09.721 }, 00:25:09.721 { 00:25:09.721 "name": "BaseBdev3", 00:25:09.722 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:09.722 "is_configured": true, 00:25:09.722 "data_offset": 2048, 00:25:09.722 "data_size": 63488 00:25:09.722 }, 00:25:09.722 { 00:25:09.722 "name": "BaseBdev4", 00:25:09.722 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:09.722 "is_configured": true, 00:25:09.722 "data_offset": 2048, 00:25:09.722 "data_size": 63488 00:25:09.722 } 00:25:09.722 ] 00:25:09.722 }' 00:25:09.722 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.722 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:09.979 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:09.979 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:09.979 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:10.237 [2024-07-12 18:28:53.723387] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:10.237 [2024-07-12 18:28:53.744347] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:10.237 [2024-07-12 18:28:53.744406] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:10.237 [2024-07-12 18:28:53.744423] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:10.237 [2024-07-12 18:28:53.744431] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.237 18:28:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.495 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.495 "name": "raid_bdev1", 00:25:10.495 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:10.495 "strip_size_kb": 0, 00:25:10.495 "state": "online", 00:25:10.495 "raid_level": "raid1", 00:25:10.495 "superblock": true, 00:25:10.495 "num_base_bdevs": 4, 00:25:10.495 "num_base_bdevs_discovered": 3, 00:25:10.495 "num_base_bdevs_operational": 3, 00:25:10.495 "base_bdevs_list": [ 00:25:10.495 { 00:25:10.495 "name": null, 00:25:10.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.495 "is_configured": false, 00:25:10.495 "data_offset": 2048, 00:25:10.495 "data_size": 63488 00:25:10.495 }, 00:25:10.495 { 00:25:10.495 "name": "BaseBdev2", 
00:25:10.495 "uuid": "76e518af-1c43-5cba-9625-6b887213e732", 00:25:10.495 "is_configured": true, 00:25:10.495 "data_offset": 2048, 00:25:10.495 "data_size": 63488 00:25:10.495 }, 00:25:10.495 { 00:25:10.495 "name": "BaseBdev3", 00:25:10.495 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:10.495 "is_configured": true, 00:25:10.495 "data_offset": 2048, 00:25:10.495 "data_size": 63488 00:25:10.495 }, 00:25:10.495 { 00:25:10.495 "name": "BaseBdev4", 00:25:10.495 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:10.495 "is_configured": true, 00:25:10.495 "data_offset": 2048, 00:25:10.495 "data_size": 63488 00:25:10.495 } 00:25:10.495 ] 00:25:10.495 }' 00:25:10.495 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.495 18:28:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:11.061 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:11.061 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:11.061 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:11.061 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:11.061 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:11.061 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.061 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.319 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:11.319 "name": "raid_bdev1", 00:25:11.319 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:11.319 "strip_size_kb": 0, 00:25:11.319 "state": 
"online", 00:25:11.319 "raid_level": "raid1", 00:25:11.319 "superblock": true, 00:25:11.319 "num_base_bdevs": 4, 00:25:11.319 "num_base_bdevs_discovered": 3, 00:25:11.319 "num_base_bdevs_operational": 3, 00:25:11.319 "base_bdevs_list": [ 00:25:11.319 { 00:25:11.319 "name": null, 00:25:11.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:11.319 "is_configured": false, 00:25:11.319 "data_offset": 2048, 00:25:11.319 "data_size": 63488 00:25:11.319 }, 00:25:11.319 { 00:25:11.319 "name": "BaseBdev2", 00:25:11.319 "uuid": "76e518af-1c43-5cba-9625-6b887213e732", 00:25:11.319 "is_configured": true, 00:25:11.319 "data_offset": 2048, 00:25:11.319 "data_size": 63488 00:25:11.319 }, 00:25:11.319 { 00:25:11.319 "name": "BaseBdev3", 00:25:11.319 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:11.319 "is_configured": true, 00:25:11.319 "data_offset": 2048, 00:25:11.319 "data_size": 63488 00:25:11.319 }, 00:25:11.319 { 00:25:11.319 "name": "BaseBdev4", 00:25:11.319 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:11.319 "is_configured": true, 00:25:11.319 "data_offset": 2048, 00:25:11.319 "data_size": 63488 00:25:11.319 } 00:25:11.319 ] 00:25:11.319 }' 00:25:11.319 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:11.319 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:11.319 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:11.319 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:11.319 18:28:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:11.578 [2024-07-12 18:28:55.172260] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:11.578 [2024-07-12 18:28:55.176374] 
bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f26e90 00:25:11.578 [2024-07-12 18:28:55.177862] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:11.578 18:28:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:12.513 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:12.513 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.513 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:12.513 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:12.513 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.513 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.513 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.772 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.772 "name": "raid_bdev1", 00:25:12.772 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:12.772 "strip_size_kb": 0, 00:25:12.772 "state": "online", 00:25:12.772 "raid_level": "raid1", 00:25:12.772 "superblock": true, 00:25:12.772 "num_base_bdevs": 4, 00:25:12.772 "num_base_bdevs_discovered": 4, 00:25:12.772 "num_base_bdevs_operational": 4, 00:25:12.772 "process": { 00:25:12.772 "type": "rebuild", 00:25:12.772 "target": "spare", 00:25:12.772 "progress": { 00:25:12.772 "blocks": 24576, 00:25:12.772 "percent": 38 00:25:12.772 } 00:25:12.772 }, 00:25:12.772 "base_bdevs_list": [ 00:25:12.772 { 00:25:12.772 "name": "spare", 00:25:12.772 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 
00:25:12.772 "is_configured": true, 00:25:12.772 "data_offset": 2048, 00:25:12.772 "data_size": 63488 00:25:12.772 }, 00:25:12.772 { 00:25:12.772 "name": "BaseBdev2", 00:25:12.772 "uuid": "76e518af-1c43-5cba-9625-6b887213e732", 00:25:12.772 "is_configured": true, 00:25:12.772 "data_offset": 2048, 00:25:12.772 "data_size": 63488 00:25:12.772 }, 00:25:12.772 { 00:25:12.772 "name": "BaseBdev3", 00:25:12.772 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:12.772 "is_configured": true, 00:25:12.772 "data_offset": 2048, 00:25:12.772 "data_size": 63488 00:25:12.772 }, 00:25:12.772 { 00:25:12.772 "name": "BaseBdev4", 00:25:12.772 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:12.772 "is_configured": true, 00:25:12.772 "data_offset": 2048, 00:25:12.772 "data_size": 63488 00:25:12.772 } 00:25:12.772 ] 00:25:12.772 }' 00:25:12.772 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.772 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:12.772 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.031 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:13.031 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:13.031 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:13.031 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:13.031 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:13.031 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:13.031 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:13.031 18:28:56 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:13.290 [2024-07-12 18:28:56.761428] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:13.290 [2024-07-12 18:28:56.890631] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1f26e90 00:25:13.290 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:13.290 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:13.290 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:13.290 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.290 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:13.290 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:13.290 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.290 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.290 18:28:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.549 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.549 "name": "raid_bdev1", 00:25:13.549 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:13.549 "strip_size_kb": 0, 00:25:13.549 "state": "online", 00:25:13.549 "raid_level": "raid1", 00:25:13.549 "superblock": true, 00:25:13.549 "num_base_bdevs": 4, 00:25:13.549 "num_base_bdevs_discovered": 3, 00:25:13.549 "num_base_bdevs_operational": 3, 00:25:13.549 "process": { 00:25:13.549 "type": 
"rebuild", 00:25:13.549 "target": "spare", 00:25:13.549 "progress": { 00:25:13.549 "blocks": 36864, 00:25:13.549 "percent": 58 00:25:13.549 } 00:25:13.549 }, 00:25:13.549 "base_bdevs_list": [ 00:25:13.549 { 00:25:13.549 "name": "spare", 00:25:13.549 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:13.549 "is_configured": true, 00:25:13.549 "data_offset": 2048, 00:25:13.549 "data_size": 63488 00:25:13.549 }, 00:25:13.549 { 00:25:13.549 "name": null, 00:25:13.549 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.549 "is_configured": false, 00:25:13.549 "data_offset": 2048, 00:25:13.549 "data_size": 63488 00:25:13.549 }, 00:25:13.549 { 00:25:13.549 "name": "BaseBdev3", 00:25:13.549 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:13.549 "is_configured": true, 00:25:13.549 "data_offset": 2048, 00:25:13.549 "data_size": 63488 00:25:13.549 }, 00:25:13.549 { 00:25:13.549 "name": "BaseBdev4", 00:25:13.549 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:13.549 "is_configured": true, 00:25:13.549 "data_offset": 2048, 00:25:13.549 "data_size": 63488 00:25:13.549 } 00:25:13.549 ] 00:25:13.549 }' 00:25:13.549 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.549 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=910 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 
-- # local raid_bdev_name=raid_bdev1 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.550 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.117 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:14.117 "name": "raid_bdev1", 00:25:14.117 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:14.117 "strip_size_kb": 0, 00:25:14.117 "state": "online", 00:25:14.117 "raid_level": "raid1", 00:25:14.117 "superblock": true, 00:25:14.117 "num_base_bdevs": 4, 00:25:14.117 "num_base_bdevs_discovered": 3, 00:25:14.117 "num_base_bdevs_operational": 3, 00:25:14.117 "process": { 00:25:14.117 "type": "rebuild", 00:25:14.117 "target": "spare", 00:25:14.117 "progress": { 00:25:14.117 "blocks": 49152, 00:25:14.117 "percent": 77 00:25:14.117 } 00:25:14.117 }, 00:25:14.117 "base_bdevs_list": [ 00:25:14.117 { 00:25:14.117 "name": "spare", 00:25:14.117 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:14.117 "is_configured": true, 00:25:14.117 "data_offset": 2048, 00:25:14.117 "data_size": 63488 00:25:14.117 }, 00:25:14.117 { 00:25:14.117 "name": null, 00:25:14.117 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.117 "is_configured": false, 00:25:14.117 "data_offset": 2048, 00:25:14.117 "data_size": 63488 00:25:14.117 }, 00:25:14.117 { 00:25:14.117 "name": "BaseBdev3", 00:25:14.117 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:14.117 "is_configured": true, 00:25:14.117 "data_offset": 2048, 
00:25:14.117 "data_size": 63488 00:25:14.117 }, 00:25:14.117 { 00:25:14.117 "name": "BaseBdev4", 00:25:14.117 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:14.117 "is_configured": true, 00:25:14.117 "data_offset": 2048, 00:25:14.117 "data_size": 63488 00:25:14.117 } 00:25:14.117 ] 00:25:14.117 }' 00:25:14.117 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:14.117 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:14.117 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:14.117 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:14.117 18:28:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:14.683 [2024-07-12 18:28:58.402287] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:14.683 [2024-07-12 18:28:58.402345] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:14.683 [2024-07-12 18:28:58.402438] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:15.249 18:28:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:15.249 18:28:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:15.249 18:28:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:15.249 18:28:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:15.249 18:28:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:15.249 18:28:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:15.249 18:28:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.249 18:28:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.508 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:15.508 "name": "raid_bdev1", 00:25:15.508 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:15.508 "strip_size_kb": 0, 00:25:15.508 "state": "online", 00:25:15.508 "raid_level": "raid1", 00:25:15.508 "superblock": true, 00:25:15.508 "num_base_bdevs": 4, 00:25:15.508 "num_base_bdevs_discovered": 3, 00:25:15.508 "num_base_bdevs_operational": 3, 00:25:15.508 "base_bdevs_list": [ 00:25:15.508 { 00:25:15.508 "name": "spare", 00:25:15.508 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:15.508 "is_configured": true, 00:25:15.508 "data_offset": 2048, 00:25:15.508 "data_size": 63488 00:25:15.508 }, 00:25:15.508 { 00:25:15.508 "name": null, 00:25:15.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.508 "is_configured": false, 00:25:15.508 "data_offset": 2048, 00:25:15.508 "data_size": 63488 00:25:15.508 }, 00:25:15.508 { 00:25:15.508 "name": "BaseBdev3", 00:25:15.508 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:15.508 "is_configured": true, 00:25:15.508 "data_offset": 2048, 00:25:15.508 "data_size": 63488 00:25:15.508 }, 00:25:15.508 { 00:25:15.508 "name": "BaseBdev4", 00:25:15.508 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:15.508 "is_configured": true, 00:25:15.508 "data_offset": 2048, 00:25:15.508 "data_size": 63488 00:25:15.508 } 00:25:15.508 ] 00:25:15.508 }' 00:25:15.508 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:15.508 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.509 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:15.767 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:15.767 "name": "raid_bdev1", 00:25:15.767 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:15.767 "strip_size_kb": 0, 00:25:15.767 "state": "online", 00:25:15.767 "raid_level": "raid1", 00:25:15.767 "superblock": true, 00:25:15.767 "num_base_bdevs": 4, 00:25:15.767 "num_base_bdevs_discovered": 3, 00:25:15.767 "num_base_bdevs_operational": 3, 00:25:15.767 "base_bdevs_list": [ 00:25:15.767 { 00:25:15.767 "name": "spare", 00:25:15.767 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:15.767 "is_configured": true, 00:25:15.767 "data_offset": 2048, 00:25:15.767 "data_size": 63488 00:25:15.767 }, 00:25:15.767 { 00:25:15.767 "name": null, 00:25:15.767 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:15.767 "is_configured": false, 00:25:15.767 "data_offset": 2048, 00:25:15.767 "data_size": 
63488 00:25:15.767 }, 00:25:15.767 { 00:25:15.767 "name": "BaseBdev3", 00:25:15.767 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:15.767 "is_configured": true, 00:25:15.767 "data_offset": 2048, 00:25:15.767 "data_size": 63488 00:25:15.767 }, 00:25:15.767 { 00:25:15.767 "name": "BaseBdev4", 00:25:15.767 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:15.767 "is_configured": true, 00:25:15.767 "data_offset": 2048, 00:25:15.767 "data_size": 63488 00:25:15.767 } 00:25:15.767 ] 00:25:15.767 }' 00:25:15.767 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:15.767 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:15.767 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:16.026 18:28:59 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:16.026 "name": "raid_bdev1", 00:25:16.026 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:16.026 "strip_size_kb": 0, 00:25:16.026 "state": "online", 00:25:16.026 "raid_level": "raid1", 00:25:16.026 "superblock": true, 00:25:16.026 "num_base_bdevs": 4, 00:25:16.026 "num_base_bdevs_discovered": 3, 00:25:16.026 "num_base_bdevs_operational": 3, 00:25:16.026 "base_bdevs_list": [ 00:25:16.026 { 00:25:16.026 "name": "spare", 00:25:16.026 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:16.026 "is_configured": true, 00:25:16.026 "data_offset": 2048, 00:25:16.026 "data_size": 63488 00:25:16.026 }, 00:25:16.026 { 00:25:16.026 "name": null, 00:25:16.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:16.026 "is_configured": false, 00:25:16.026 "data_offset": 2048, 00:25:16.026 "data_size": 63488 00:25:16.026 }, 00:25:16.026 { 00:25:16.026 "name": "BaseBdev3", 00:25:16.026 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:16.026 "is_configured": true, 00:25:16.026 "data_offset": 2048, 00:25:16.026 "data_size": 63488 00:25:16.026 }, 00:25:16.026 { 00:25:16.026 "name": "BaseBdev4", 00:25:16.026 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:16.026 "is_configured": true, 00:25:16.026 "data_offset": 2048, 00:25:16.026 "data_size": 63488 00:25:16.026 } 00:25:16.026 ] 00:25:16.026 }' 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:16.026 18:28:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 
-- # set +x 00:25:16.709 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:16.968 [2024-07-12 18:29:00.523870] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:16.968 [2024-07-12 18:29:00.523900] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:16.968 [2024-07-12 18:29:00.523966] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:16.968 [2024-07-12 18:29:00.524036] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:16.968 [2024-07-12 18:29:00.524048] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f268a0 name raid_bdev1, state offline 00:25:16.968 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.968 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:17.226 18:29:00 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:17.226 18:29:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:17.484 /dev/nbd0 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:17.484 1+0 records in 00:25:17.484 
1+0 records out 00:25:17.484 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261025 s, 15.7 MB/s 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:17.484 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:17.743 /dev/nbd1 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 
00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:17.743 1+0 records in 00:25:17.743 1+0 records out 00:25:17.743 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333193 s, 12.3 MB/s 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:17.743 18:29:01 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:17.743 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:18.002 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:18.570 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:18.570 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:18.570 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:18.570 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:18.570 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:18.570 
18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:18.570 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:18.570 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:18.570 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:18.570 18:29:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:18.570 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:18.829 [2024-07-12 18:29:02.465465] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:18.829 [2024-07-12 18:29:02.465512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.829 [2024-07-12 18:29:02.465533] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa0b40 00:25:18.829 [2024-07-12 18:29:02.465546] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.829 [2024-07-12 18:29:02.467195] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.829 [2024-07-12 18:29:02.467228] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:18.829 [2024-07-12 18:29:02.467307] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:18.830 [2024-07-12 18:29:02.467335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:18.830 [2024-07-12 18:29:02.467442] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:18.830 [2024-07-12 18:29:02.467515] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:25:18.830 spare 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.830 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.089 [2024-07-12 18:29:02.567829] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f2aba0 00:25:19.089 [2024-07-12 18:29:02.567846] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:19.089 [2024-07-12 18:29:02.568052] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f27560 00:25:19.089 [2024-07-12 18:29:02.568199] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f2aba0 00:25:19.089 [2024-07-12 
18:29:02.568209] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f2aba0 00:25:19.089 [2024-07-12 18:29:02.568305] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:19.089 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:19.089 "name": "raid_bdev1", 00:25:19.089 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:19.089 "strip_size_kb": 0, 00:25:19.089 "state": "online", 00:25:19.089 "raid_level": "raid1", 00:25:19.089 "superblock": true, 00:25:19.089 "num_base_bdevs": 4, 00:25:19.089 "num_base_bdevs_discovered": 3, 00:25:19.089 "num_base_bdevs_operational": 3, 00:25:19.089 "base_bdevs_list": [ 00:25:19.089 { 00:25:19.089 "name": "spare", 00:25:19.089 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:19.089 "is_configured": true, 00:25:19.089 "data_offset": 2048, 00:25:19.089 "data_size": 63488 00:25:19.089 }, 00:25:19.089 { 00:25:19.089 "name": null, 00:25:19.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:19.089 "is_configured": false, 00:25:19.089 "data_offset": 2048, 00:25:19.089 "data_size": 63488 00:25:19.089 }, 00:25:19.089 { 00:25:19.089 "name": "BaseBdev3", 00:25:19.089 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:19.089 "is_configured": true, 00:25:19.089 "data_offset": 2048, 00:25:19.089 "data_size": 63488 00:25:19.089 }, 00:25:19.089 { 00:25:19.089 "name": "BaseBdev4", 00:25:19.089 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:19.089 "is_configured": true, 00:25:19.089 "data_offset": 2048, 00:25:19.089 "data_size": 63488 00:25:19.089 } 00:25:19.089 ] 00:25:19.089 }' 00:25:19.089 18:29:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:19.089 18:29:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:19.658 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 
00:25:19.658 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:19.658 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:19.658 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:19.658 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:19.658 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.658 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.917 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:19.917 "name": "raid_bdev1", 00:25:19.917 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:19.917 "strip_size_kb": 0, 00:25:19.917 "state": "online", 00:25:19.917 "raid_level": "raid1", 00:25:19.917 "superblock": true, 00:25:19.917 "num_base_bdevs": 4, 00:25:19.917 "num_base_bdevs_discovered": 3, 00:25:19.917 "num_base_bdevs_operational": 3, 00:25:19.917 "base_bdevs_list": [ 00:25:19.917 { 00:25:19.917 "name": "spare", 00:25:19.917 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:19.917 "is_configured": true, 00:25:19.917 "data_offset": 2048, 00:25:19.917 "data_size": 63488 00:25:19.917 }, 00:25:19.917 { 00:25:19.917 "name": null, 00:25:19.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:19.917 "is_configured": false, 00:25:19.917 "data_offset": 2048, 00:25:19.917 "data_size": 63488 00:25:19.917 }, 00:25:19.917 { 00:25:19.917 "name": "BaseBdev3", 00:25:19.917 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:19.917 "is_configured": true, 00:25:19.917 "data_offset": 2048, 00:25:19.917 "data_size": 63488 00:25:19.917 }, 00:25:19.917 { 00:25:19.917 "name": "BaseBdev4", 00:25:19.917 "uuid": 
"b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:19.917 "is_configured": true, 00:25:19.917 "data_offset": 2048, 00:25:19.917 "data_size": 63488 00:25:19.917 } 00:25:19.917 ] 00:25:19.917 }' 00:25:19.917 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:19.917 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:19.917 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:20.176 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:20.176 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.176 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:20.435 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:20.435 18:29:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:20.435 [2024-07-12 18:29:04.150084] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.694 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:20.694 "name": "raid_bdev1", 00:25:20.694 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:20.694 "strip_size_kb": 0, 00:25:20.694 "state": "online", 00:25:20.694 "raid_level": "raid1", 00:25:20.694 "superblock": true, 00:25:20.694 "num_base_bdevs": 4, 00:25:20.694 "num_base_bdevs_discovered": 2, 00:25:20.694 "num_base_bdevs_operational": 2, 00:25:20.694 "base_bdevs_list": [ 00:25:20.694 { 00:25:20.694 "name": null, 00:25:20.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.695 "is_configured": false, 00:25:20.695 "data_offset": 2048, 00:25:20.695 "data_size": 63488 00:25:20.695 }, 00:25:20.695 { 00:25:20.695 "name": null, 00:25:20.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.695 "is_configured": false, 00:25:20.695 "data_offset": 2048, 00:25:20.695 "data_size": 63488 00:25:20.695 }, 00:25:20.695 { 00:25:20.695 "name": "BaseBdev3", 00:25:20.695 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:20.695 "is_configured": true, 00:25:20.695 "data_offset": 2048, 00:25:20.695 "data_size": 63488 00:25:20.695 }, 00:25:20.695 { 00:25:20.695 "name": 
"BaseBdev4", 00:25:20.695 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:20.695 "is_configured": true, 00:25:20.695 "data_offset": 2048, 00:25:20.695 "data_size": 63488 00:25:20.695 } 00:25:20.695 ] 00:25:20.695 }' 00:25:20.695 18:29:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:20.695 18:29:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:21.629 18:29:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:21.629 [2024-07-12 18:29:05.180830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:21.629 [2024-07-12 18:29:05.180985] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:21.629 [2024-07-12 18:29:05.181002] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:21.629 [2024-07-12 18:29:05.181032] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:21.629 [2024-07-12 18:29:05.185014] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f2a740 00:25:21.629 [2024-07-12 18:29:05.187433] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:21.629 18:29:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:22.562 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:22.562 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:22.562 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:22.562 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:22.562 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:22.562 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.562 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.820 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:22.820 "name": "raid_bdev1", 00:25:22.820 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:22.820 "strip_size_kb": 0, 00:25:22.820 "state": "online", 00:25:22.820 "raid_level": "raid1", 00:25:22.820 "superblock": true, 00:25:22.820 "num_base_bdevs": 4, 00:25:22.820 "num_base_bdevs_discovered": 3, 00:25:22.820 "num_base_bdevs_operational": 3, 00:25:22.820 "process": { 00:25:22.820 "type": "rebuild", 00:25:22.820 "target": "spare", 00:25:22.820 "progress": { 00:25:22.820 "blocks": 24576, 00:25:22.820 "percent": 38 
00:25:22.820 } 00:25:22.820 }, 00:25:22.820 "base_bdevs_list": [ 00:25:22.820 { 00:25:22.820 "name": "spare", 00:25:22.820 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:22.820 "is_configured": true, 00:25:22.820 "data_offset": 2048, 00:25:22.820 "data_size": 63488 00:25:22.820 }, 00:25:22.820 { 00:25:22.820 "name": null, 00:25:22.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:22.820 "is_configured": false, 00:25:22.820 "data_offset": 2048, 00:25:22.820 "data_size": 63488 00:25:22.820 }, 00:25:22.820 { 00:25:22.820 "name": "BaseBdev3", 00:25:22.820 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:22.820 "is_configured": true, 00:25:22.820 "data_offset": 2048, 00:25:22.820 "data_size": 63488 00:25:22.820 }, 00:25:22.820 { 00:25:22.820 "name": "BaseBdev4", 00:25:22.820 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:22.820 "is_configured": true, 00:25:22.820 "data_offset": 2048, 00:25:22.820 "data_size": 63488 00:25:22.820 } 00:25:22.820 ] 00:25:22.820 }' 00:25:22.820 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:22.820 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:22.820 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:23.079 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:23.079 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:23.079 [2024-07-12 18:29:06.779162] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:23.079 [2024-07-12 18:29:06.800039] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:23.079 [2024-07-12 18:29:06.800082] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:25:23.079 [2024-07-12 18:29:06.800099] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:23.079 [2024-07-12 18:29:06.800107] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.339 18:29:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.339 18:29:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.339 "name": "raid_bdev1", 00:25:23.339 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:23.339 "strip_size_kb": 0, 00:25:23.339 "state": "online", 00:25:23.339 
"raid_level": "raid1", 00:25:23.339 "superblock": true, 00:25:23.339 "num_base_bdevs": 4, 00:25:23.339 "num_base_bdevs_discovered": 2, 00:25:23.339 "num_base_bdevs_operational": 2, 00:25:23.339 "base_bdevs_list": [ 00:25:23.339 { 00:25:23.339 "name": null, 00:25:23.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.339 "is_configured": false, 00:25:23.339 "data_offset": 2048, 00:25:23.339 "data_size": 63488 00:25:23.339 }, 00:25:23.339 { 00:25:23.339 "name": null, 00:25:23.339 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.339 "is_configured": false, 00:25:23.339 "data_offset": 2048, 00:25:23.339 "data_size": 63488 00:25:23.339 }, 00:25:23.339 { 00:25:23.339 "name": "BaseBdev3", 00:25:23.339 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:23.339 "is_configured": true, 00:25:23.339 "data_offset": 2048, 00:25:23.339 "data_size": 63488 00:25:23.339 }, 00:25:23.339 { 00:25:23.339 "name": "BaseBdev4", 00:25:23.339 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:23.339 "is_configured": true, 00:25:23.339 "data_offset": 2048, 00:25:23.339 "data_size": 63488 00:25:23.339 } 00:25:23.339 ] 00:25:23.339 }' 00:25:23.339 18:29:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.339 18:29:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:23.907 18:29:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:24.167 [2024-07-12 18:29:07.742601] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:24.167 [2024-07-12 18:29:07.742655] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:24.167 [2024-07-12 18:29:07.742676] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2b010 00:25:24.167 [2024-07-12 18:29:07.742689] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:24.167 [2024-07-12 18:29:07.743077] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:24.167 [2024-07-12 18:29:07.743097] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:24.167 [2024-07-12 18:29:07.743176] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:24.167 [2024-07-12 18:29:07.743188] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:24.167 [2024-07-12 18:29:07.743199] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:24.167 [2024-07-12 18:29:07.743218] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:24.167 [2024-07-12 18:29:07.747255] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fa6420 00:25:24.167 spare 00:25:24.167 [2024-07-12 18:29:07.748729] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:24.167 18:29:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:25.102 18:29:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:25.102 18:29:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:25.102 18:29:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:25.102 18:29:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:25.102 18:29:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:25.102 18:29:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:25:25.102 18:29:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.360 18:29:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:25.360 "name": "raid_bdev1", 00:25:25.360 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:25.360 "strip_size_kb": 0, 00:25:25.360 "state": "online", 00:25:25.360 "raid_level": "raid1", 00:25:25.360 "superblock": true, 00:25:25.360 "num_base_bdevs": 4, 00:25:25.360 "num_base_bdevs_discovered": 3, 00:25:25.360 "num_base_bdevs_operational": 3, 00:25:25.360 "process": { 00:25:25.360 "type": "rebuild", 00:25:25.360 "target": "spare", 00:25:25.360 "progress": { 00:25:25.360 "blocks": 22528, 00:25:25.361 "percent": 35 00:25:25.361 } 00:25:25.361 }, 00:25:25.361 "base_bdevs_list": [ 00:25:25.361 { 00:25:25.361 "name": "spare", 00:25:25.361 "uuid": "370f62c3-9fab-5839-8b6d-fde934fc5960", 00:25:25.361 "is_configured": true, 00:25:25.361 "data_offset": 2048, 00:25:25.361 "data_size": 63488 00:25:25.361 }, 00:25:25.361 { 00:25:25.361 "name": null, 00:25:25.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.361 "is_configured": false, 00:25:25.361 "data_offset": 2048, 00:25:25.361 "data_size": 63488 00:25:25.361 }, 00:25:25.361 { 00:25:25.361 "name": "BaseBdev3", 00:25:25.361 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:25.361 "is_configured": true, 00:25:25.361 "data_offset": 2048, 00:25:25.361 "data_size": 63488 00:25:25.361 }, 00:25:25.361 { 00:25:25.361 "name": "BaseBdev4", 00:25:25.361 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:25.361 "is_configured": true, 00:25:25.361 "data_offset": 2048, 00:25:25.361 "data_size": 63488 00:25:25.361 } 00:25:25.361 ] 00:25:25.361 }' 00:25:25.361 18:29:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:25.361 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:25.361 
18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:25.361 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:25.361 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:25.619 [2024-07-12 18:29:09.264776] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:25.876 [2024-07-12 18:29:09.361514] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:25.876 [2024-07-12 18:29:09.361561] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:25.876 [2024-07-12 18:29:09.361578] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:25.876 [2024-07-12 18:29:09.361586] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.876 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.134 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.134 "name": "raid_bdev1", 00:25:26.134 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:26.134 "strip_size_kb": 0, 00:25:26.134 "state": "online", 00:25:26.134 "raid_level": "raid1", 00:25:26.134 "superblock": true, 00:25:26.134 "num_base_bdevs": 4, 00:25:26.134 "num_base_bdevs_discovered": 2, 00:25:26.134 "num_base_bdevs_operational": 2, 00:25:26.134 "base_bdevs_list": [ 00:25:26.134 { 00:25:26.134 "name": null, 00:25:26.134 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.134 "is_configured": false, 00:25:26.135 "data_offset": 2048, 00:25:26.135 "data_size": 63488 00:25:26.135 }, 00:25:26.135 { 00:25:26.135 "name": null, 00:25:26.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.135 "is_configured": false, 00:25:26.135 "data_offset": 2048, 00:25:26.135 "data_size": 63488 00:25:26.135 }, 00:25:26.135 { 00:25:26.135 "name": "BaseBdev3", 00:25:26.135 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:26.135 "is_configured": true, 00:25:26.135 "data_offset": 2048, 00:25:26.135 "data_size": 63488 00:25:26.135 }, 00:25:26.135 { 00:25:26.135 "name": "BaseBdev4", 00:25:26.135 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:26.135 "is_configured": true, 00:25:26.135 "data_offset": 2048, 00:25:26.135 "data_size": 63488 00:25:26.135 } 00:25:26.135 ] 00:25:26.135 }' 00:25:26.135 18:29:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:25:26.135 18:29:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:26.701 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:26.701 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:26.701 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:26.701 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:26.701 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:26.701 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.701 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.267 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:27.267 "name": "raid_bdev1", 00:25:27.267 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:27.267 "strip_size_kb": 0, 00:25:27.267 "state": "online", 00:25:27.267 "raid_level": "raid1", 00:25:27.267 "superblock": true, 00:25:27.267 "num_base_bdevs": 4, 00:25:27.267 "num_base_bdevs_discovered": 2, 00:25:27.267 "num_base_bdevs_operational": 2, 00:25:27.267 "base_bdevs_list": [ 00:25:27.267 { 00:25:27.267 "name": null, 00:25:27.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.267 "is_configured": false, 00:25:27.267 "data_offset": 2048, 00:25:27.267 "data_size": 63488 00:25:27.267 }, 00:25:27.267 { 00:25:27.267 "name": null, 00:25:27.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.267 "is_configured": false, 00:25:27.267 "data_offset": 2048, 00:25:27.267 "data_size": 63488 00:25:27.267 }, 00:25:27.267 { 00:25:27.267 "name": "BaseBdev3", 00:25:27.267 "uuid": 
"6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:27.267 "is_configured": true, 00:25:27.267 "data_offset": 2048, 00:25:27.267 "data_size": 63488 00:25:27.267 }, 00:25:27.267 { 00:25:27.267 "name": "BaseBdev4", 00:25:27.267 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:27.267 "is_configured": true, 00:25:27.267 "data_offset": 2048, 00:25:27.267 "data_size": 63488 00:25:27.267 } 00:25:27.267 ] 00:25:27.267 }' 00:25:27.267 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:27.267 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:27.267 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:27.267 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:27.267 18:29:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:27.525 18:29:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:27.782 [2024-07-12 18:29:11.327456] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:27.782 [2024-07-12 18:29:11.327511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:27.782 [2024-07-12 18:29:11.327532] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa6e30 00:25:27.782 [2024-07-12 18:29:11.327544] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:27.782 [2024-07-12 18:29:11.327887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:27.782 [2024-07-12 18:29:11.327907] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: BaseBdev1 00:25:27.782 [2024-07-12 18:29:11.327980] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:27.782 [2024-07-12 18:29:11.327993] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:27.782 [2024-07-12 18:29:11.328004] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:27.782 BaseBdev1 00:25:27.782 18:29:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:28.715 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:28.715 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:28.715 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:28.715 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:28.715 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:28.715 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:28.715 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:28.715 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:28.715 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:28.716 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:28.716 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.716 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:25:28.973 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:28.973 "name": "raid_bdev1", 00:25:28.973 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:28.973 "strip_size_kb": 0, 00:25:28.973 "state": "online", 00:25:28.973 "raid_level": "raid1", 00:25:28.973 "superblock": true, 00:25:28.973 "num_base_bdevs": 4, 00:25:28.973 "num_base_bdevs_discovered": 2, 00:25:28.973 "num_base_bdevs_operational": 2, 00:25:28.973 "base_bdevs_list": [ 00:25:28.973 { 00:25:28.973 "name": null, 00:25:28.973 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.973 "is_configured": false, 00:25:28.973 "data_offset": 2048, 00:25:28.973 "data_size": 63488 00:25:28.974 }, 00:25:28.974 { 00:25:28.974 "name": null, 00:25:28.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.974 "is_configured": false, 00:25:28.974 "data_offset": 2048, 00:25:28.974 "data_size": 63488 00:25:28.974 }, 00:25:28.974 { 00:25:28.974 "name": "BaseBdev3", 00:25:28.974 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:28.974 "is_configured": true, 00:25:28.974 "data_offset": 2048, 00:25:28.974 "data_size": 63488 00:25:28.974 }, 00:25:28.974 { 00:25:28.974 "name": "BaseBdev4", 00:25:28.974 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:28.974 "is_configured": true, 00:25:28.974 "data_offset": 2048, 00:25:28.974 "data_size": 63488 00:25:28.974 } 00:25:28.974 ] 00:25:28.974 }' 00:25:28.974 18:29:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:28.974 18:29:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:29.539 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:29.539 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.539 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:25:29.539 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:29.539 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.539 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.539 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.796 "name": "raid_bdev1", 00:25:29.796 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:29.796 "strip_size_kb": 0, 00:25:29.796 "state": "online", 00:25:29.796 "raid_level": "raid1", 00:25:29.796 "superblock": true, 00:25:29.796 "num_base_bdevs": 4, 00:25:29.796 "num_base_bdevs_discovered": 2, 00:25:29.796 "num_base_bdevs_operational": 2, 00:25:29.796 "base_bdevs_list": [ 00:25:29.796 { 00:25:29.796 "name": null, 00:25:29.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.796 "is_configured": false, 00:25:29.796 "data_offset": 2048, 00:25:29.796 "data_size": 63488 00:25:29.796 }, 00:25:29.796 { 00:25:29.796 "name": null, 00:25:29.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.796 "is_configured": false, 00:25:29.796 "data_offset": 2048, 00:25:29.796 "data_size": 63488 00:25:29.796 }, 00:25:29.796 { 00:25:29.796 "name": "BaseBdev3", 00:25:29.796 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:29.796 "is_configured": true, 00:25:29.796 "data_offset": 2048, 00:25:29.796 "data_size": 63488 00:25:29.796 }, 00:25:29.796 { 00:25:29.796 "name": "BaseBdev4", 00:25:29.796 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:29.796 "is_configured": true, 00:25:29.796 "data_offset": 2048, 00:25:29.796 "data_size": 63488 00:25:29.796 } 00:25:29.796 ] 00:25:29.796 }' 00:25:29.796 18:29:13 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:29.796 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:30.054 [2024-07-12 18:29:13.613535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:30.054 [2024-07-12 18:29:13.613662] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:30.054 [2024-07-12 18:29:13.613679] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:30.054 request: 00:25:30.054 { 00:25:30.054 "base_bdev": "BaseBdev1", 00:25:30.054 "raid_bdev": "raid_bdev1", 00:25:30.054 "method": "bdev_raid_add_base_bdev", 00:25:30.054 "req_id": 1 00:25:30.054 } 00:25:30.054 Got JSON-RPC error response 00:25:30.054 response: 00:25:30.054 { 00:25:30.054 "code": -22, 00:25:30.054 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:30.054 } 00:25:30.054 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:25:30.054 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:30.054 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:30.054 18:29:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:30.054 18:29:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.988 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:31.247 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:31.247 "name": "raid_bdev1", 00:25:31.247 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:31.247 "strip_size_kb": 0, 00:25:31.247 "state": "online", 00:25:31.247 "raid_level": "raid1", 00:25:31.247 "superblock": true, 00:25:31.247 "num_base_bdevs": 4, 00:25:31.247 "num_base_bdevs_discovered": 2, 00:25:31.247 "num_base_bdevs_operational": 2, 00:25:31.247 "base_bdevs_list": [ 00:25:31.247 { 00:25:31.247 "name": null, 00:25:31.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:31.247 "is_configured": false, 00:25:31.247 "data_offset": 2048, 00:25:31.247 "data_size": 63488 00:25:31.247 }, 00:25:31.247 { 00:25:31.247 "name": null, 00:25:31.247 "uuid": "00000000-0000-0000-0000-000000000000", 
00:25:31.247 "is_configured": false, 00:25:31.247 "data_offset": 2048, 00:25:31.247 "data_size": 63488 00:25:31.247 }, 00:25:31.247 { 00:25:31.247 "name": "BaseBdev3", 00:25:31.247 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:31.247 "is_configured": true, 00:25:31.247 "data_offset": 2048, 00:25:31.247 "data_size": 63488 00:25:31.247 }, 00:25:31.247 { 00:25:31.247 "name": "BaseBdev4", 00:25:31.247 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:31.247 "is_configured": true, 00:25:31.247 "data_offset": 2048, 00:25:31.247 "data_size": 63488 00:25:31.247 } 00:25:31.247 ] 00:25:31.247 }' 00:25:31.247 18:29:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:31.247 18:29:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:31.813 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:31.813 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:31.813 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:31.813 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:31.813 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:31.813 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.813 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.070 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:32.071 "name": "raid_bdev1", 00:25:32.071 "uuid": "1681fd7e-4c21-4426-95ed-ecfd4b1abaf9", 00:25:32.071 "strip_size_kb": 0, 00:25:32.071 "state": "online", 00:25:32.071 "raid_level": "raid1", 00:25:32.071 
"superblock": true, 00:25:32.071 "num_base_bdevs": 4, 00:25:32.071 "num_base_bdevs_discovered": 2, 00:25:32.071 "num_base_bdevs_operational": 2, 00:25:32.071 "base_bdevs_list": [ 00:25:32.071 { 00:25:32.071 "name": null, 00:25:32.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.071 "is_configured": false, 00:25:32.071 "data_offset": 2048, 00:25:32.071 "data_size": 63488 00:25:32.071 }, 00:25:32.071 { 00:25:32.071 "name": null, 00:25:32.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.071 "is_configured": false, 00:25:32.071 "data_offset": 2048, 00:25:32.071 "data_size": 63488 00:25:32.071 }, 00:25:32.071 { 00:25:32.071 "name": "BaseBdev3", 00:25:32.071 "uuid": "6d1c7c30-9fb2-5ea7-a605-d0a8dc8796dd", 00:25:32.071 "is_configured": true, 00:25:32.071 "data_offset": 2048, 00:25:32.071 "data_size": 63488 00:25:32.071 }, 00:25:32.071 { 00:25:32.071 "name": "BaseBdev4", 00:25:32.071 "uuid": "b83a9cf7-e7c4-5387-aadb-9b347feea760", 00:25:32.071 "is_configured": true, 00:25:32.071 "data_offset": 2048, 00:25:32.071 "data_size": 63488 00:25:32.071 } 00:25:32.071 ] 00:25:32.071 }' 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2584387 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2584387 ']' 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2584387 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:25:32.071 18:29:15 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2584387 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2584387' 00:25:32.071 killing process with pid 2584387 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2584387 00:25:32.071 Received shutdown signal, test time was about 60.000000 seconds 00:25:32.071 00:25:32.071 Latency(us) 00:25:32.071 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:32.071 =================================================================================================================== 00:25:32.071 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:32.071 [2024-07-12 18:29:15.788617] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:32.071 [2024-07-12 18:29:15.788712] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:32.071 [2024-07-12 18:29:15.788768] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:32.071 18:29:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2584387 00:25:32.071 [2024-07-12 18:29:15.788781] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2aba0 name raid_bdev1, state offline 00:25:32.329 [2024-07-12 18:29:15.835433] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:32.329 18:29:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:25:32.329 00:25:32.329 real 0m37.442s 
00:25:32.329 user 0m53.842s 00:25:32.329 sys 0m6.959s 00:25:32.329 18:29:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:32.329 18:29:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:32.329 ************************************ 00:25:32.329 END TEST raid_rebuild_test_sb 00:25:32.329 ************************************ 00:25:32.587 18:29:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:32.587 18:29:16 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:25:32.587 18:29:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:32.587 18:29:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:32.587 18:29:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:32.587 ************************************ 00:25:32.587 START TEST raid_rebuild_test_io 00:25:32.587 ************************************ 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:32.587 18:29:16 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:32.587 18:29:16 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2589745 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2589745 /var/tmp/spdk-raid.sock 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2589745 ']' 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:32.587 18:29:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:32.588 18:29:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:32.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:32.588 18:29:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:32.588 18:29:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:32.588 18:29:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:32.588 [2024-07-12 18:29:16.182587] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:25:32.588 [2024-07-12 18:29:16.182653] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2589745 ] 00:25:32.588 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:32.588 Zero copy mechanism will not be used. 00:25:32.588 [2024-07-12 18:29:16.310870] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:32.846 [2024-07-12 18:29:16.416987] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:32.846 [2024-07-12 18:29:16.486170] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:32.846 [2024-07-12 18:29:16.486206] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:33.412 18:29:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:33.412 18:29:17 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:25:33.412 18:29:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:33.412 18:29:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:33.978 BaseBdev1_malloc 00:25:33.978 18:29:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:34.236 [2024-07-12 18:29:17.825122] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:34.236 [2024-07-12 18:29:17.825168] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:34.236 [2024-07-12 18:29:17.825195] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x133fd40 00:25:34.236 [2024-07-12 18:29:17.825208] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:34.236 [2024-07-12 18:29:17.826942] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:34.236 [2024-07-12 18:29:17.826974] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:34.236 BaseBdev1 00:25:34.236 18:29:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:34.236 18:29:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:34.802 BaseBdev2_malloc 00:25:34.802 18:29:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:35.060 [2024-07-12 18:29:18.576579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:35.060 [2024-07-12 18:29:18.576629] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:35.060 [2024-07-12 18:29:18.576655] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1340860 00:25:35.060 [2024-07-12 18:29:18.576668] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:35.060 [2024-07-12 18:29:18.578262] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:35.060 [2024-07-12 18:29:18.578294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:35.060 BaseBdev2 00:25:35.060 18:29:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:35.060 18:29:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:35.318 BaseBdev3_malloc 00:25:35.318 18:29:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:35.884 [2024-07-12 18:29:19.316340] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:35.885 [2024-07-12 18:29:19.316389] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:35.885 [2024-07-12 18:29:19.316414] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ed8f0 00:25:35.885 [2024-07-12 18:29:19.316427] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:35.885 [2024-07-12 18:29:19.318027] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:35.885 [2024-07-12 18:29:19.318057] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:35.885 BaseBdev3 00:25:35.885 18:29:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:35.885 18:29:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:36.144 BaseBdev4_malloc 00:25:36.144 18:29:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:36.427 [2024-07-12 18:29:20.078935] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:36.427 [2024-07-12 18:29:20.078986] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:36.427 
[2024-07-12 18:29:20.079012] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ecad0 00:25:36.427 [2024-07-12 18:29:20.079032] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:36.427 [2024-07-12 18:29:20.080592] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:36.427 [2024-07-12 18:29:20.080621] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:36.427 BaseBdev4 00:25:36.427 18:29:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:36.689 spare_malloc 00:25:36.689 18:29:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:36.947 spare_delay 00:25:36.947 18:29:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:37.206 [2024-07-12 18:29:20.793445] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:37.206 [2024-07-12 18:29:20.793491] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:37.206 [2024-07-12 18:29:20.793516] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14f15b0 00:25:37.206 [2024-07-12 18:29:20.793529] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:37.206 [2024-07-12 18:29:20.795145] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:37.206 [2024-07-12 18:29:20.795175] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:37.206 spare 00:25:37.206 
18:29:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:37.464 [2024-07-12 18:29:21.026080] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:37.464 [2024-07-12 18:29:21.027434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:37.464 [2024-07-12 18:29:21.027489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:37.464 [2024-07-12 18:29:21.027535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:37.464 [2024-07-12 18:29:21.027614] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14708a0 00:25:37.464 [2024-07-12 18:29:21.027624] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:37.464 [2024-07-12 18:29:21.027842] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14eae10 00:25:37.464 [2024-07-12 18:29:21.028003] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14708a0 00:25:37.464 [2024-07-12 18:29:21.028013] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14708a0 00:25:37.464 [2024-07-12 18:29:21.028129] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:37.464 18:29:21 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.464 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:37.723 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:37.723 "name": "raid_bdev1", 00:25:37.723 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:37.723 "strip_size_kb": 0, 00:25:37.723 "state": "online", 00:25:37.723 "raid_level": "raid1", 00:25:37.723 "superblock": false, 00:25:37.723 "num_base_bdevs": 4, 00:25:37.723 "num_base_bdevs_discovered": 4, 00:25:37.723 "num_base_bdevs_operational": 4, 00:25:37.723 "base_bdevs_list": [ 00:25:37.723 { 00:25:37.723 "name": "BaseBdev1", 00:25:37.723 "uuid": "4d66ce15-17cd-5b91-8c60-712278062e40", 00:25:37.723 "is_configured": true, 00:25:37.723 "data_offset": 0, 00:25:37.723 "data_size": 65536 00:25:37.723 }, 00:25:37.723 { 00:25:37.723 "name": "BaseBdev2", 00:25:37.723 "uuid": "d86ff2da-5a37-50b5-b771-5dbdaece3a63", 00:25:37.723 "is_configured": true, 00:25:37.723 "data_offset": 0, 00:25:37.723 "data_size": 65536 00:25:37.723 }, 00:25:37.723 { 00:25:37.723 "name": "BaseBdev3", 00:25:37.723 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 
00:25:37.723 "is_configured": true, 00:25:37.723 "data_offset": 0, 00:25:37.723 "data_size": 65536 00:25:37.723 }, 00:25:37.723 { 00:25:37.723 "name": "BaseBdev4", 00:25:37.723 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:37.723 "is_configured": true, 00:25:37.723 "data_offset": 0, 00:25:37.723 "data_size": 65536 00:25:37.723 } 00:25:37.723 ] 00:25:37.723 }' 00:25:37.723 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:37.723 18:29:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:38.290 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:38.290 18:29:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:38.548 [2024-07-12 18:29:22.105227] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:38.548 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:38.548 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.548 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:38.807 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:38.807 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:38.807 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:38.807 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s 
/var/tmp/spdk-raid.sock perform_tests 00:25:38.807 [2024-07-12 18:29:22.484062] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1476970 00:25:38.807 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:38.807 Zero copy mechanism will not be used. 00:25:38.807 Running I/O for 60 seconds... 00:25:39.065 [2024-07-12 18:29:22.615792] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:39.065 [2024-07-12 18:29:22.632089] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1476970 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.065 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:25:39.323 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:39.323 "name": "raid_bdev1", 00:25:39.323 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:39.323 "strip_size_kb": 0, 00:25:39.323 "state": "online", 00:25:39.323 "raid_level": "raid1", 00:25:39.323 "superblock": false, 00:25:39.323 "num_base_bdevs": 4, 00:25:39.323 "num_base_bdevs_discovered": 3, 00:25:39.323 "num_base_bdevs_operational": 3, 00:25:39.323 "base_bdevs_list": [ 00:25:39.323 { 00:25:39.323 "name": null, 00:25:39.323 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.323 "is_configured": false, 00:25:39.323 "data_offset": 0, 00:25:39.323 "data_size": 65536 00:25:39.323 }, 00:25:39.323 { 00:25:39.323 "name": "BaseBdev2", 00:25:39.323 "uuid": "d86ff2da-5a37-50b5-b771-5dbdaece3a63", 00:25:39.323 "is_configured": true, 00:25:39.323 "data_offset": 0, 00:25:39.323 "data_size": 65536 00:25:39.323 }, 00:25:39.323 { 00:25:39.323 "name": "BaseBdev3", 00:25:39.323 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:39.323 "is_configured": true, 00:25:39.323 "data_offset": 0, 00:25:39.323 "data_size": 65536 00:25:39.323 }, 00:25:39.323 { 00:25:39.323 "name": "BaseBdev4", 00:25:39.323 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:39.323 "is_configured": true, 00:25:39.323 "data_offset": 0, 00:25:39.323 "data_size": 65536 00:25:39.323 } 00:25:39.323 ] 00:25:39.323 }' 00:25:39.323 18:29:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:39.323 18:29:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:39.888 18:29:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:40.146 [2024-07-12 18:29:23.773038] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:40.146 18:29:23 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:40.146 [2024-07-12 18:29:23.856722] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1046fa0 00:25:40.146 [2024-07-12 18:29:23.859102] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:40.404 [2024-07-12 18:29:23.969985] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:40.404 [2024-07-12 18:29:23.970390] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:40.663 [2024-07-12 18:29:24.221810] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:40.663 [2024-07-12 18:29:24.222093] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:41.229 [2024-07-12 18:29:24.699118] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:41.229 [2024-07-12 18:29:24.699324] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:41.229 18:29:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:41.229 18:29:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:41.229 18:29:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:41.229 18:29:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:41.229 18:29:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:41.229 18:29:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:41.229 18:29:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.229 [2024-07-12 18:29:24.945154] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:41.488 [2024-07-12 18:29:25.075658] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:41.488 [2024-07-12 18:29:25.076313] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:41.747 18:29:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:41.748 "name": "raid_bdev1", 00:25:41.748 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:41.748 "strip_size_kb": 0, 00:25:41.748 "state": "online", 00:25:41.748 "raid_level": "raid1", 00:25:41.748 "superblock": false, 00:25:41.748 "num_base_bdevs": 4, 00:25:41.748 "num_base_bdevs_discovered": 4, 00:25:41.748 "num_base_bdevs_operational": 4, 00:25:41.748 "process": { 00:25:41.748 "type": "rebuild", 00:25:41.748 "target": "spare", 00:25:41.748 "progress": { 00:25:41.748 "blocks": 18432, 00:25:41.748 "percent": 28 00:25:41.748 } 00:25:41.748 }, 00:25:41.748 "base_bdevs_list": [ 00:25:41.748 { 00:25:41.748 "name": "spare", 00:25:41.748 "uuid": "38e23596-e70f-57d0-96db-8962f1930dbe", 00:25:41.748 "is_configured": true, 00:25:41.748 "data_offset": 0, 00:25:41.748 "data_size": 65536 00:25:41.748 }, 00:25:41.748 { 00:25:41.748 "name": "BaseBdev2", 00:25:41.748 "uuid": "d86ff2da-5a37-50b5-b771-5dbdaece3a63", 00:25:41.748 "is_configured": true, 00:25:41.748 "data_offset": 0, 00:25:41.748 "data_size": 65536 00:25:41.748 }, 00:25:41.748 { 00:25:41.748 "name": "BaseBdev3", 00:25:41.748 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:41.748 "is_configured": true, 00:25:41.748 "data_offset": 0, 00:25:41.748 "data_size": 65536 
00:25:41.748 }, 00:25:41.748 { 00:25:41.748 "name": "BaseBdev4", 00:25:41.748 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:41.748 "is_configured": true, 00:25:41.748 "data_offset": 0, 00:25:41.748 "data_size": 65536 00:25:41.748 } 00:25:41.748 ] 00:25:41.748 }' 00:25:41.748 18:29:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:41.748 18:29:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:41.748 18:29:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:41.748 [2024-07-12 18:29:25.454408] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:42.007 18:29:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:42.007 18:29:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:42.007 [2024-07-12 18:29:25.699112] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:42.007 [2024-07-12 18:29:25.707179] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:42.265 [2024-07-12 18:29:25.947344] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:42.265 [2024-07-12 18:29:25.960099] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:42.265 [2024-07-12 18:29:25.960139] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:42.265 [2024-07-12 18:29:25.960151] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:42.265 [2024-07-12 18:29:25.991784] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 
0 raid_ch: 0x1476970 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.524 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.783 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:42.783 "name": "raid_bdev1", 00:25:42.783 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:42.783 "strip_size_kb": 0, 00:25:42.783 "state": "online", 00:25:42.783 "raid_level": "raid1", 00:25:42.783 "superblock": false, 00:25:42.783 "num_base_bdevs": 4, 00:25:42.783 "num_base_bdevs_discovered": 3, 00:25:42.783 "num_base_bdevs_operational": 3, 00:25:42.783 "base_bdevs_list": [ 00:25:42.783 { 00:25:42.783 "name": null, 00:25:42.783 
"uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.783 "is_configured": false, 00:25:42.783 "data_offset": 0, 00:25:42.783 "data_size": 65536 00:25:42.783 }, 00:25:42.783 { 00:25:42.783 "name": "BaseBdev2", 00:25:42.783 "uuid": "d86ff2da-5a37-50b5-b771-5dbdaece3a63", 00:25:42.783 "is_configured": true, 00:25:42.783 "data_offset": 0, 00:25:42.783 "data_size": 65536 00:25:42.783 }, 00:25:42.783 { 00:25:42.783 "name": "BaseBdev3", 00:25:42.783 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:42.783 "is_configured": true, 00:25:42.783 "data_offset": 0, 00:25:42.783 "data_size": 65536 00:25:42.783 }, 00:25:42.783 { 00:25:42.783 "name": "BaseBdev4", 00:25:42.783 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:42.783 "is_configured": true, 00:25:42.783 "data_offset": 0, 00:25:42.783 "data_size": 65536 00:25:42.783 } 00:25:42.783 ] 00:25:42.783 }' 00:25:42.783 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:42.783 18:29:26 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:43.350 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:43.350 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:43.350 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:43.350 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:43.350 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.350 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.350 18:29:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.608 18:29:27 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:43.608 "name": "raid_bdev1", 00:25:43.608 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:43.608 "strip_size_kb": 0, 00:25:43.608 "state": "online", 00:25:43.609 "raid_level": "raid1", 00:25:43.609 "superblock": false, 00:25:43.609 "num_base_bdevs": 4, 00:25:43.609 "num_base_bdevs_discovered": 3, 00:25:43.609 "num_base_bdevs_operational": 3, 00:25:43.609 "base_bdevs_list": [ 00:25:43.609 { 00:25:43.609 "name": null, 00:25:43.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.609 "is_configured": false, 00:25:43.609 "data_offset": 0, 00:25:43.609 "data_size": 65536 00:25:43.609 }, 00:25:43.609 { 00:25:43.609 "name": "BaseBdev2", 00:25:43.609 "uuid": "d86ff2da-5a37-50b5-b771-5dbdaece3a63", 00:25:43.609 "is_configured": true, 00:25:43.609 "data_offset": 0, 00:25:43.609 "data_size": 65536 00:25:43.609 }, 00:25:43.609 { 00:25:43.609 "name": "BaseBdev3", 00:25:43.609 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:43.609 "is_configured": true, 00:25:43.609 "data_offset": 0, 00:25:43.609 "data_size": 65536 00:25:43.609 }, 00:25:43.609 { 00:25:43.609 "name": "BaseBdev4", 00:25:43.609 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:43.609 "is_configured": true, 00:25:43.609 "data_offset": 0, 00:25:43.609 "data_size": 65536 00:25:43.609 } 00:25:43.609 ] 00:25:43.609 }' 00:25:43.609 18:29:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:43.609 18:29:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:43.609 18:29:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:43.609 18:29:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:43.609 18:29:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev 
raid_bdev1 spare 00:25:43.867 [2024-07-12 18:29:27.484895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:43.867 18:29:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:43.867 [2024-07-12 18:29:27.579687] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1473270 00:25:43.867 [2024-07-12 18:29:27.581250] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:44.125 [2024-07-12 18:29:27.690245] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:44.125 [2024-07-12 18:29:27.690582] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:44.384 [2024-07-12 18:29:27.904607] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:44.384 [2024-07-12 18:29:27.904887] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:44.642 [2024-07-12 18:29:28.138274] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:44.642 [2024-07-12 18:29:28.139508] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:44.900 [2024-07-12 18:29:28.388091] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:44.900 [2024-07-12 18:29:28.388364] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:44.900 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:44.900 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:25:44.900 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:44.900 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:44.900 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:44.900 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.900 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.158 [2024-07-12 18:29:28.703877] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:45.158 [2024-07-12 18:29:28.704378] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:45.158 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:45.158 "name": "raid_bdev1", 00:25:45.158 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:45.158 "strip_size_kb": 0, 00:25:45.158 "state": "online", 00:25:45.158 "raid_level": "raid1", 00:25:45.158 "superblock": false, 00:25:45.158 "num_base_bdevs": 4, 00:25:45.158 "num_base_bdevs_discovered": 4, 00:25:45.158 "num_base_bdevs_operational": 4, 00:25:45.158 "process": { 00:25:45.158 "type": "rebuild", 00:25:45.158 "target": "spare", 00:25:45.158 "progress": { 00:25:45.158 "blocks": 14336, 00:25:45.158 "percent": 21 00:25:45.158 } 00:25:45.158 }, 00:25:45.158 "base_bdevs_list": [ 00:25:45.158 { 00:25:45.158 "name": "spare", 00:25:45.158 "uuid": "38e23596-e70f-57d0-96db-8962f1930dbe", 00:25:45.158 "is_configured": true, 00:25:45.158 "data_offset": 0, 00:25:45.158 "data_size": 65536 00:25:45.158 }, 00:25:45.158 { 00:25:45.158 "name": "BaseBdev2", 00:25:45.158 "uuid": 
"d86ff2da-5a37-50b5-b771-5dbdaece3a63", 00:25:45.158 "is_configured": true, 00:25:45.158 "data_offset": 0, 00:25:45.158 "data_size": 65536 00:25:45.158 }, 00:25:45.158 { 00:25:45.158 "name": "BaseBdev3", 00:25:45.158 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:45.158 "is_configured": true, 00:25:45.158 "data_offset": 0, 00:25:45.158 "data_size": 65536 00:25:45.158 }, 00:25:45.158 { 00:25:45.158 "name": "BaseBdev4", 00:25:45.158 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:45.158 "is_configured": true, 00:25:45.158 "data_offset": 0, 00:25:45.158 "data_size": 65536 00:25:45.158 } 00:25:45.158 ] 00:25:45.158 }' 00:25:45.158 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:45.158 [2024-07-12 18:29:28.826593] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:45.158 [2024-07-12 18:29:28.826881] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:45.158 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:45.158 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:45.417 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:45.417 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:45.417 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:45.417 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:45.417 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:45.417 18:29:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:45.417 [2024-07-12 18:29:29.115589] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:45.675 [2024-07-12 18:29:29.155904] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:45.675 [2024-07-12 18:29:29.264176] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1476970 00:25:45.675 [2024-07-12 18:29:29.264204] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1473270 00:25:45.675 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:45.675 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:45.675 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:45.675 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:45.675 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:45.675 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:45.675 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:45.675 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.675 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.934 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:45.934 "name": "raid_bdev1", 00:25:45.934 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:45.934 "strip_size_kb": 0, 00:25:45.934 "state": "online", 00:25:45.934 "raid_level": "raid1", 
00:25:45.934 "superblock": false, 00:25:45.934 "num_base_bdevs": 4, 00:25:45.934 "num_base_bdevs_discovered": 3, 00:25:45.934 "num_base_bdevs_operational": 3, 00:25:45.934 "process": { 00:25:45.934 "type": "rebuild", 00:25:45.934 "target": "spare", 00:25:45.934 "progress": { 00:25:45.934 "blocks": 24576, 00:25:45.934 "percent": 37 00:25:45.934 } 00:25:45.934 }, 00:25:45.934 "base_bdevs_list": [ 00:25:45.934 { 00:25:45.934 "name": "spare", 00:25:45.934 "uuid": "38e23596-e70f-57d0-96db-8962f1930dbe", 00:25:45.934 "is_configured": true, 00:25:45.934 "data_offset": 0, 00:25:45.934 "data_size": 65536 00:25:45.934 }, 00:25:45.934 { 00:25:45.934 "name": null, 00:25:45.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.934 "is_configured": false, 00:25:45.934 "data_offset": 0, 00:25:45.934 "data_size": 65536 00:25:45.934 }, 00:25:45.934 { 00:25:45.934 "name": "BaseBdev3", 00:25:45.934 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:45.934 "is_configured": true, 00:25:45.934 "data_offset": 0, 00:25:45.934 "data_size": 65536 00:25:45.934 }, 00:25:45.934 { 00:25:45.934 "name": "BaseBdev4", 00:25:45.934 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:45.934 "is_configured": true, 00:25:45.934 "data_offset": 0, 00:25:45.934 "data_size": 65536 00:25:45.934 } 00:25:45.934 ] 00:25:45.934 }' 00:25:45.934 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:45.934 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:45.934 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:45.935 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:45.935 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=942 00:25:45.935 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:45.935 
18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:45.935 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:45.935 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:45.935 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:45.935 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:45.935 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.935 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.193 [2024-07-12 18:29:29.771322] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:46.193 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:46.193 "name": "raid_bdev1", 00:25:46.193 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:46.193 "strip_size_kb": 0, 00:25:46.193 "state": "online", 00:25:46.193 "raid_level": "raid1", 00:25:46.193 "superblock": false, 00:25:46.193 "num_base_bdevs": 4, 00:25:46.193 "num_base_bdevs_discovered": 3, 00:25:46.193 "num_base_bdevs_operational": 3, 00:25:46.193 "process": { 00:25:46.193 "type": "rebuild", 00:25:46.193 "target": "spare", 00:25:46.193 "progress": { 00:25:46.193 "blocks": 28672, 00:25:46.193 "percent": 43 00:25:46.193 } 00:25:46.193 }, 00:25:46.193 "base_bdevs_list": [ 00:25:46.193 { 00:25:46.193 "name": "spare", 00:25:46.193 "uuid": "38e23596-e70f-57d0-96db-8962f1930dbe", 00:25:46.193 "is_configured": true, 00:25:46.193 "data_offset": 0, 00:25:46.193 "data_size": 65536 00:25:46.193 }, 00:25:46.193 { 00:25:46.193 "name": null, 
00:25:46.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.193 "is_configured": false, 00:25:46.193 "data_offset": 0, 00:25:46.193 "data_size": 65536 00:25:46.193 }, 00:25:46.193 { 00:25:46.193 "name": "BaseBdev3", 00:25:46.193 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:46.193 "is_configured": true, 00:25:46.193 "data_offset": 0, 00:25:46.193 "data_size": 65536 00:25:46.193 }, 00:25:46.193 { 00:25:46.193 "name": "BaseBdev4", 00:25:46.193 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:46.193 "is_configured": true, 00:25:46.193 "data_offset": 0, 00:25:46.193 "data_size": 65536 00:25:46.193 } 00:25:46.193 ] 00:25:46.193 }' 00:25:46.193 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:46.452 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:46.452 18:29:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:46.452 18:29:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:46.452 18:29:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:47.387 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:47.387 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:47.387 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:47.387 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:47.387 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:47.387 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:47.387 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.387 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:47.646 [2024-07-12 18:29:31.215341] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:25:47.646 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:47.646 "name": "raid_bdev1", 00:25:47.646 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:47.646 "strip_size_kb": 0, 00:25:47.646 "state": "online", 00:25:47.646 "raid_level": "raid1", 00:25:47.646 "superblock": false, 00:25:47.646 "num_base_bdevs": 4, 00:25:47.646 "num_base_bdevs_discovered": 3, 00:25:47.646 "num_base_bdevs_operational": 3, 00:25:47.646 "process": { 00:25:47.646 "type": "rebuild", 00:25:47.646 "target": "spare", 00:25:47.646 "progress": { 00:25:47.646 "blocks": 53248, 00:25:47.646 "percent": 81 00:25:47.646 } 00:25:47.646 }, 00:25:47.646 "base_bdevs_list": [ 00:25:47.646 { 00:25:47.646 "name": "spare", 00:25:47.646 "uuid": "38e23596-e70f-57d0-96db-8962f1930dbe", 00:25:47.646 "is_configured": true, 00:25:47.646 "data_offset": 0, 00:25:47.646 "data_size": 65536 00:25:47.646 }, 00:25:47.646 { 00:25:47.646 "name": null, 00:25:47.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:47.646 "is_configured": false, 00:25:47.646 "data_offset": 0, 00:25:47.646 "data_size": 65536 00:25:47.646 }, 00:25:47.646 { 00:25:47.646 "name": "BaseBdev3", 00:25:47.646 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:47.646 "is_configured": true, 00:25:47.646 "data_offset": 0, 00:25:47.646 "data_size": 65536 00:25:47.646 }, 00:25:47.646 { 00:25:47.646 "name": "BaseBdev4", 00:25:47.646 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:47.646 "is_configured": true, 00:25:47.646 "data_offset": 0, 00:25:47.646 "data_size": 65536 00:25:47.646 } 
00:25:47.646 ] 00:25:47.646 }' 00:25:47.646 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:47.646 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:47.646 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:47.646 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:47.646 18:29:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:47.905 [2024-07-12 18:29:31.566286] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:25:48.470 [2024-07-12 18:29:31.907482] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:48.470 [2024-07-12 18:29:32.007758] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:48.470 [2024-07-12 18:29:32.009082] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:48.728 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:48.728 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:48.728 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:48.728 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:48.728 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:48.728 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:48.728 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:25:48.728 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.986 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:48.986 "name": "raid_bdev1", 00:25:48.986 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:48.986 "strip_size_kb": 0, 00:25:48.986 "state": "online", 00:25:48.986 "raid_level": "raid1", 00:25:48.986 "superblock": false, 00:25:48.986 "num_base_bdevs": 4, 00:25:48.986 "num_base_bdevs_discovered": 3, 00:25:48.986 "num_base_bdevs_operational": 3, 00:25:48.986 "base_bdevs_list": [ 00:25:48.986 { 00:25:48.986 "name": "spare", 00:25:48.987 "uuid": "38e23596-e70f-57d0-96db-8962f1930dbe", 00:25:48.987 "is_configured": true, 00:25:48.987 "data_offset": 0, 00:25:48.987 "data_size": 65536 00:25:48.987 }, 00:25:48.987 { 00:25:48.987 "name": null, 00:25:48.987 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.987 "is_configured": false, 00:25:48.987 "data_offset": 0, 00:25:48.987 "data_size": 65536 00:25:48.987 }, 00:25:48.987 { 00:25:48.987 "name": "BaseBdev3", 00:25:48.987 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:48.987 "is_configured": true, 00:25:48.987 "data_offset": 0, 00:25:48.987 "data_size": 65536 00:25:48.987 }, 00:25:48.987 { 00:25:48.987 "name": "BaseBdev4", 00:25:48.987 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:48.987 "is_configured": true, 00:25:48.987 "data_offset": 0, 00:25:48.987 "data_size": 65536 00:25:48.987 } 00:25:48.987 ] 00:25:48.987 }' 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 
00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.987 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.246 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:49.246 "name": "raid_bdev1", 00:25:49.246 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:49.246 "strip_size_kb": 0, 00:25:49.246 "state": "online", 00:25:49.246 "raid_level": "raid1", 00:25:49.246 "superblock": false, 00:25:49.246 "num_base_bdevs": 4, 00:25:49.246 "num_base_bdevs_discovered": 3, 00:25:49.246 "num_base_bdevs_operational": 3, 00:25:49.246 "base_bdevs_list": [ 00:25:49.246 { 00:25:49.246 "name": "spare", 00:25:49.246 "uuid": "38e23596-e70f-57d0-96db-8962f1930dbe", 00:25:49.246 "is_configured": true, 00:25:49.246 "data_offset": 0, 00:25:49.246 "data_size": 65536 00:25:49.246 }, 00:25:49.246 { 00:25:49.246 "name": null, 00:25:49.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.246 "is_configured": false, 00:25:49.246 "data_offset": 0, 00:25:49.246 "data_size": 65536 00:25:49.246 }, 00:25:49.246 { 00:25:49.246 "name": "BaseBdev3", 00:25:49.246 "uuid": 
"faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:49.246 "is_configured": true, 00:25:49.246 "data_offset": 0, 00:25:49.246 "data_size": 65536 00:25:49.246 }, 00:25:49.246 { 00:25:49.246 "name": "BaseBdev4", 00:25:49.246 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:49.246 "is_configured": true, 00:25:49.246 "data_offset": 0, 00:25:49.246 "data_size": 65536 00:25:49.246 } 00:25:49.246 ] 00:25:49.247 }' 00:25:49.247 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:49.506 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:49.506 18:29:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:49.506 18:29:33 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.506 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.780 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:49.780 "name": "raid_bdev1", 00:25:49.780 "uuid": "4bfbb459-f59c-4530-a0bc-8a3a8a84613e", 00:25:49.780 "strip_size_kb": 0, 00:25:49.780 "state": "online", 00:25:49.780 "raid_level": "raid1", 00:25:49.780 "superblock": false, 00:25:49.780 "num_base_bdevs": 4, 00:25:49.780 "num_base_bdevs_discovered": 3, 00:25:49.780 "num_base_bdevs_operational": 3, 00:25:49.780 "base_bdevs_list": [ 00:25:49.780 { 00:25:49.780 "name": "spare", 00:25:49.780 "uuid": "38e23596-e70f-57d0-96db-8962f1930dbe", 00:25:49.780 "is_configured": true, 00:25:49.780 "data_offset": 0, 00:25:49.780 "data_size": 65536 00:25:49.780 }, 00:25:49.780 { 00:25:49.780 "name": null, 00:25:49.780 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.780 "is_configured": false, 00:25:49.780 "data_offset": 0, 00:25:49.780 "data_size": 65536 00:25:49.780 }, 00:25:49.780 { 00:25:49.780 "name": "BaseBdev3", 00:25:49.780 "uuid": "faa0aa7b-a27c-5e36-b21a-687b3efd1190", 00:25:49.780 "is_configured": true, 00:25:49.780 "data_offset": 0, 00:25:49.780 "data_size": 65536 00:25:49.780 }, 00:25:49.780 { 00:25:49.780 "name": "BaseBdev4", 00:25:49.780 "uuid": "f64c2d6d-590e-583a-801f-0a9fc4d3d075", 00:25:49.780 "is_configured": true, 00:25:49.780 "data_offset": 0, 00:25:49.780 "data_size": 65536 00:25:49.780 } 00:25:49.780 ] 00:25:49.780 }' 00:25:49.780 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:49.780 18:29:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:50.345 18:29:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:50.603 [2024-07-12 18:29:34.094638] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:50.603 [2024-07-12 18:29:34.094669] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:50.603 00:25:50.603 Latency(us) 00:25:50.603 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:50.603 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:50.603 raid_bdev1 : 11.60 90.52 271.56 0.00 0.00 14695.24 288.50 113063.85 00:25:50.603 =================================================================================================================== 00:25:50.603 Total : 90.52 271.56 0.00 0.00 14695.24 288.50 113063.85 00:25:50.603 [2024-07-12 18:29:34.118762] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:50.603 [2024-07-12 18:29:34.118791] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:50.603 [2024-07-12 18:29:34.118883] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:50.603 [2024-07-12 18:29:34.118895] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14708a0 name raid_bdev1, state offline 00:25:50.603 0 00:25:50.603 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.603 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' 
true = true ']' 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:50.861 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:51.118 /dev/nbd0 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@871 -- # break 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:51.118 1+0 records in 00:25:51.118 1+0 records out 00:25:51.118 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264733 s, 15.5 MB/s 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:25:51.118 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:51.119 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:51.119 /dev/nbd1 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:51.377 18:29:34 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:51.377 1+0 records in 00:25:51.377 1+0 records out 00:25:51.377 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233187 s, 17.6 MB/s 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@51 -- # local i 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:51.377 18:29:34 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@11 -- # local nbd_list 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:51.635 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:51.892 /dev/nbd1 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:51.892 1+0 records in 00:25:51.892 1+0 records out 00:25:51.892 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285406 s, 14.4 MB/s 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:51.892 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:52.150 18:29:35 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:52.150 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2589745 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2589745 ']' 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2589745 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:52.419 18:29:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2589745 00:25:52.419 18:29:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:52.419 18:29:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:52.419 18:29:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2589745' 00:25:52.419 killing process with pid 2589745 00:25:52.419 18:29:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2589745 00:25:52.419 Received shutdown signal, test time was about 13.503463 seconds 00:25:52.419 00:25:52.419 Latency(us) 00:25:52.419 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:52.419 =================================================================================================================== 00:25:52.419 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:52.419 [2024-07-12 18:29:36.022893] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:25:52.420 18:29:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2589745 00:25:52.420 [2024-07-12 18:29:36.063806] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:52.679 00:25:52.679 real 0m20.151s 00:25:52.679 user 0m31.664s 00:25:52.679 sys 0m3.469s 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:52.679 ************************************ 00:25:52.679 END TEST raid_rebuild_test_io 00:25:52.679 ************************************ 00:25:52.679 18:29:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:52.679 18:29:36 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:25:52.679 18:29:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:52.679 18:29:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:52.679 18:29:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:52.679 ************************************ 00:25:52.679 START TEST raid_rebuild_test_sb_io 00:25:52.679 ************************************ 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:52.679 18:29:36 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:52.679 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2592510 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2592510 /var/tmp/spdk-raid.sock 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2592510 ']' 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:52.680 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:52.680 18:29:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:52.936 [2024-07-12 18:29:36.428303] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:25:52.936 [2024-07-12 18:29:36.428373] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2592510 ] 00:25:52.936 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:52.936 Zero copy mechanism will not be used. 00:25:52.936 [2024-07-12 18:29:36.558402] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:52.936 [2024-07-12 18:29:36.655375] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:53.192 [2024-07-12 18:29:36.715326] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:53.192 [2024-07-12 18:29:36.715364] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:53.757 18:29:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:53.757 18:29:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:25:53.757 18:29:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:53.757 18:29:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:54.014 BaseBdev1_malloc 00:25:54.014 18:29:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p 
BaseBdev1 00:25:54.272 [2024-07-12 18:29:37.774855] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:54.272 [2024-07-12 18:29:37.774903] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:54.272 [2024-07-12 18:29:37.774924] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2406d40 00:25:54.272 [2024-07-12 18:29:37.774944] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:54.272 [2024-07-12 18:29:37.776594] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:54.272 [2024-07-12 18:29:37.776625] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:54.272 BaseBdev1 00:25:54.272 18:29:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:54.272 18:29:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:54.272 BaseBdev2_malloc 00:25:54.272 18:29:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:54.529 [2024-07-12 18:29:38.140703] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:54.529 [2024-07-12 18:29:38.140747] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:54.529 [2024-07-12 18:29:38.140769] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2407860 00:25:54.529 [2024-07-12 18:29:38.140781] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:54.529 [2024-07-12 18:29:38.142177] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:54.529 [2024-07-12 
18:29:38.142205] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:54.529 BaseBdev2 00:25:54.529 18:29:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:54.529 18:29:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:54.789 BaseBdev3_malloc 00:25:54.789 18:29:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:55.092 [2024-07-12 18:29:38.634878] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:55.092 [2024-07-12 18:29:38.634936] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:55.092 [2024-07-12 18:29:38.634962] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b48f0 00:25:55.092 [2024-07-12 18:29:38.634975] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:55.092 [2024-07-12 18:29:38.636442] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:55.092 [2024-07-12 18:29:38.636471] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:55.092 BaseBdev3 00:25:55.092 18:29:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:55.092 18:29:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:55.358 BaseBdev4_malloc 00:25:55.358 18:29:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:55.358 [2024-07-12 18:29:39.060670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:55.358 [2024-07-12 18:29:39.060722] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:55.358 [2024-07-12 18:29:39.060742] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b3ad0 00:25:55.358 [2024-07-12 18:29:39.060755] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:55.358 [2024-07-12 18:29:39.062200] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:55.358 [2024-07-12 18:29:39.062230] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:55.358 BaseBdev4 00:25:55.358 18:29:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:55.616 spare_malloc 00:25:55.616 18:29:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:55.874 spare_delay 00:25:55.874 18:29:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:56.132 [2024-07-12 18:29:39.799253] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:56.132 [2024-07-12 18:29:39.799299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:56.132 [2024-07-12 18:29:39.799318] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x25b85b0 00:25:56.132 [2024-07-12 18:29:39.799331] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:56.132 [2024-07-12 18:29:39.800722] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:56.132 [2024-07-12 18:29:39.800748] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:56.132 spare 00:25:56.132 18:29:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:56.390 [2024-07-12 18:29:40.039939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:56.390 [2024-07-12 18:29:40.041161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:56.390 [2024-07-12 18:29:40.041217] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:56.390 [2024-07-12 18:29:40.041266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:56.390 [2024-07-12 18:29:40.041454] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25378a0 00:25:56.390 [2024-07-12 18:29:40.041467] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:56.390 [2024-07-12 18:29:40.041651] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b1e10 00:25:56.390 [2024-07-12 18:29:40.041795] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25378a0 00:25:56.390 [2024-07-12 18:29:40.041811] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25378a0 00:25:56.390 [2024-07-12 18:29:40.041900] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.390 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.648 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:56.648 "name": "raid_bdev1", 00:25:56.648 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:25:56.648 "strip_size_kb": 0, 00:25:56.648 "state": "online", 00:25:56.648 "raid_level": "raid1", 00:25:56.648 "superblock": true, 00:25:56.648 "num_base_bdevs": 4, 00:25:56.648 "num_base_bdevs_discovered": 4, 00:25:56.648 "num_base_bdevs_operational": 4, 00:25:56.648 "base_bdevs_list": [ 00:25:56.648 { 00:25:56.648 "name": "BaseBdev1", 00:25:56.648 "uuid": "66448032-8702-5bdb-b568-b422f5084afc", 00:25:56.648 
"is_configured": true, 00:25:56.648 "data_offset": 2048, 00:25:56.648 "data_size": 63488 00:25:56.648 }, 00:25:56.648 { 00:25:56.648 "name": "BaseBdev2", 00:25:56.648 "uuid": "b198e750-9714-5192-b845-777d43e36444", 00:25:56.648 "is_configured": true, 00:25:56.648 "data_offset": 2048, 00:25:56.648 "data_size": 63488 00:25:56.648 }, 00:25:56.648 { 00:25:56.648 "name": "BaseBdev3", 00:25:56.648 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:25:56.648 "is_configured": true, 00:25:56.648 "data_offset": 2048, 00:25:56.648 "data_size": 63488 00:25:56.648 }, 00:25:56.648 { 00:25:56.648 "name": "BaseBdev4", 00:25:56.648 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:25:56.648 "is_configured": true, 00:25:56.648 "data_offset": 2048, 00:25:56.648 "data_size": 63488 00:25:56.648 } 00:25:56.648 ] 00:25:56.648 }' 00:25:56.648 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:56.648 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:57.213 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:57.213 18:29:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:57.471 [2024-07-12 18:29:41.127086] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:57.471 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:57.471 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.471 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:57.729 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 
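The `raid_bdev_size=63488` and `data_offset=2048` values above are extracted from the RPC JSON with `jq`, exactly as the `-- # jq -r ...` trace lines show. A small self-contained illustration against a stub document (the stub merges trimmed pieces of the `bdev_get_bdevs` and `bdev_raid_get_bdevs` dumps seen in this run; field names match the dumps, the rest is omitted):

```shell
#!/usr/bin/env bash
# Stub combining fragments of the two RPC dumps traced above.
json='[{"name":"raid_bdev1","num_blocks":63488,
        "base_bdevs_list":[{"name":"BaseBdev1","data_offset":2048}]}]'

# Same filters as in the trace: .[].num_blocks and
# .[].base_bdevs_list[0].data_offset
raid_bdev_size=$(echo "$json" | jq -r '.[].num_blocks')
data_offset=$(echo "$json" | jq -r '.[].base_bdevs_list[0].data_offset')

echo "size=$raid_bdev_size offset=$data_offset"
```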
00:25:57.729 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:57.729 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:57.729 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:57.987 [2024-07-12 18:29:41.505938] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2406670 00:25:57.987 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:57.987 Zero copy mechanism will not be used. 00:25:57.987 Running I/O for 60 seconds... 00:25:57.987 [2024-07-12 18:29:41.622035] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:57.987 [2024-07-12 18:29:41.630256] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2406670 00:25:57.987 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:57.987 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:57.987 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:57.988 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:57.988 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:57.988 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:57.988 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:57.988 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:57.988 18:29:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:57.988 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:57.988 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.988 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.246 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:58.246 "name": "raid_bdev1", 00:25:58.246 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:25:58.246 "strip_size_kb": 0, 00:25:58.246 "state": "online", 00:25:58.246 "raid_level": "raid1", 00:25:58.246 "superblock": true, 00:25:58.246 "num_base_bdevs": 4, 00:25:58.246 "num_base_bdevs_discovered": 3, 00:25:58.246 "num_base_bdevs_operational": 3, 00:25:58.246 "base_bdevs_list": [ 00:25:58.246 { 00:25:58.246 "name": null, 00:25:58.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.246 "is_configured": false, 00:25:58.246 "data_offset": 2048, 00:25:58.246 "data_size": 63488 00:25:58.246 }, 00:25:58.246 { 00:25:58.246 "name": "BaseBdev2", 00:25:58.246 "uuid": "b198e750-9714-5192-b845-777d43e36444", 00:25:58.246 "is_configured": true, 00:25:58.246 "data_offset": 2048, 00:25:58.246 "data_size": 63488 00:25:58.246 }, 00:25:58.246 { 00:25:58.246 "name": "BaseBdev3", 00:25:58.246 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:25:58.246 "is_configured": true, 00:25:58.246 "data_offset": 2048, 00:25:58.246 "data_size": 63488 00:25:58.246 }, 00:25:58.246 { 00:25:58.246 "name": "BaseBdev4", 00:25:58.246 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:25:58.246 "is_configured": true, 00:25:58.246 "data_offset": 2048, 00:25:58.246 "data_size": 63488 00:25:58.246 } 00:25:58.246 ] 00:25:58.246 }' 00:25:58.246 18:29:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:58.246 18:29:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:59.180 18:29:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:59.180 [2024-07-12 18:29:42.840187] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:59.180 [2024-07-12 18:29:42.896764] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2539ba0 00:25:59.180 18:29:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:59.180 [2024-07-12 18:29:42.899148] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:59.460 [2024-07-12 18:29:43.029322] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:59.460 [2024-07-12 18:29:43.029827] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:59.460 [2024-07-12 18:29:43.178030] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:59.460 [2024-07-12 18:29:43.178626] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:00.026 [2024-07-12 18:29:43.536239] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:00.026 [2024-07-12 18:29:43.648295] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:00.026 [2024-07-12 18:29:43.648957] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 
00:26:00.284 18:29:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:00.284 18:29:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:00.284 18:29:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:00.284 18:29:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:00.284 18:29:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:00.284 18:29:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.284 18:29:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.542 [2024-07-12 18:29:44.013240] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:00.542 [2024-07-12 18:29:44.126118] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:00.542 [2024-07-12 18:29:44.126345] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:00.542 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:00.542 "name": "raid_bdev1", 00:26:00.542 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:00.542 "strip_size_kb": 0, 00:26:00.542 "state": "online", 00:26:00.542 "raid_level": "raid1", 00:26:00.542 "superblock": true, 00:26:00.542 "num_base_bdevs": 4, 00:26:00.542 "num_base_bdevs_discovered": 4, 00:26:00.542 "num_base_bdevs_operational": 4, 00:26:00.542 "process": { 00:26:00.542 "type": "rebuild", 00:26:00.542 "target": "spare", 00:26:00.542 "progress": { 00:26:00.542 "blocks": 
16384, 00:26:00.542 "percent": 25 00:26:00.542 } 00:26:00.542 }, 00:26:00.542 "base_bdevs_list": [ 00:26:00.542 { 00:26:00.542 "name": "spare", 00:26:00.542 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:00.542 "is_configured": true, 00:26:00.542 "data_offset": 2048, 00:26:00.542 "data_size": 63488 00:26:00.542 }, 00:26:00.542 { 00:26:00.542 "name": "BaseBdev2", 00:26:00.542 "uuid": "b198e750-9714-5192-b845-777d43e36444", 00:26:00.542 "is_configured": true, 00:26:00.542 "data_offset": 2048, 00:26:00.542 "data_size": 63488 00:26:00.542 }, 00:26:00.542 { 00:26:00.542 "name": "BaseBdev3", 00:26:00.542 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:00.542 "is_configured": true, 00:26:00.542 "data_offset": 2048, 00:26:00.542 "data_size": 63488 00:26:00.542 }, 00:26:00.542 { 00:26:00.542 "name": "BaseBdev4", 00:26:00.542 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:00.542 "is_configured": true, 00:26:00.542 "data_offset": 2048, 00:26:00.542 "data_size": 63488 00:26:00.542 } 00:26:00.542 ] 00:26:00.542 }' 00:26:00.542 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.542 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:00.542 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.542 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:00.542 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:00.800 [2024-07-12 18:29:44.490829] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:00.800 [2024-07-12 18:29:44.490913] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:00.800 
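The `percent` field in the rebuild `process` object is consistent with integer division of `blocks` by the raid bdev size (63488 blocks in this run): 12288 blocks gives 19%, 16384 gives 25%, 20480 gives 32%, matching every progress dump in this log. The arithmetic can be checked directly:

```shell
#!/usr/bin/env bash
# progress.percent == blocks * 100 / raid_bdev_size (integer division),
# checked against the values dumped in this run.
raid_bdev_size=63488
for blocks in 12288 16384 20480; do
  echo "blocks=$blocks percent=$(( blocks * 100 / raid_bdev_size ))"
done
```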
[2024-07-12 18:29:44.491529] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:01.059 [2024-07-12 18:29:44.593828] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:01.059 [2024-07-12 18:29:44.604941] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:01.059 [2024-07-12 18:29:44.604973] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:01.059 [2024-07-12 18:29:44.604984] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:01.059 [2024-07-12 18:29:44.619844] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2406670 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.059 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.318 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.318 "name": "raid_bdev1", 00:26:01.318 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:01.318 "strip_size_kb": 0, 00:26:01.318 "state": "online", 00:26:01.318 "raid_level": "raid1", 00:26:01.318 "superblock": true, 00:26:01.318 "num_base_bdevs": 4, 00:26:01.318 "num_base_bdevs_discovered": 3, 00:26:01.318 "num_base_bdevs_operational": 3, 00:26:01.318 "base_bdevs_list": [ 00:26:01.318 { 00:26:01.318 "name": null, 00:26:01.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.318 "is_configured": false, 00:26:01.318 "data_offset": 2048, 00:26:01.318 "data_size": 63488 00:26:01.318 }, 00:26:01.318 { 00:26:01.318 "name": "BaseBdev2", 00:26:01.318 "uuid": "b198e750-9714-5192-b845-777d43e36444", 00:26:01.318 "is_configured": true, 00:26:01.318 "data_offset": 2048, 00:26:01.318 "data_size": 63488 00:26:01.318 }, 00:26:01.318 { 00:26:01.318 "name": "BaseBdev3", 00:26:01.318 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:01.318 "is_configured": true, 00:26:01.318 "data_offset": 2048, 00:26:01.318 "data_size": 63488 00:26:01.318 }, 00:26:01.318 { 00:26:01.318 "name": "BaseBdev4", 00:26:01.318 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:01.318 "is_configured": true, 00:26:01.318 "data_offset": 2048, 00:26:01.318 "data_size": 63488 00:26:01.318 } 00:26:01.318 ] 00:26:01.318 }' 00:26:01.318 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.318 18:29:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:01.884 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:26:01.884 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:01.884 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:01.884 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:01.884 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:01.884 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.884 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.143 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:02.143 "name": "raid_bdev1", 00:26:02.143 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:02.143 "strip_size_kb": 0, 00:26:02.143 "state": "online", 00:26:02.143 "raid_level": "raid1", 00:26:02.143 "superblock": true, 00:26:02.143 "num_base_bdevs": 4, 00:26:02.143 "num_base_bdevs_discovered": 3, 00:26:02.143 "num_base_bdevs_operational": 3, 00:26:02.143 "base_bdevs_list": [ 00:26:02.143 { 00:26:02.143 "name": null, 00:26:02.143 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.143 "is_configured": false, 00:26:02.143 "data_offset": 2048, 00:26:02.143 "data_size": 63488 00:26:02.143 }, 00:26:02.143 { 00:26:02.143 "name": "BaseBdev2", 00:26:02.143 "uuid": "b198e750-9714-5192-b845-777d43e36444", 00:26:02.143 "is_configured": true, 00:26:02.143 "data_offset": 2048, 00:26:02.143 "data_size": 63488 00:26:02.143 }, 00:26:02.143 { 00:26:02.143 "name": "BaseBdev3", 00:26:02.143 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:02.143 "is_configured": true, 00:26:02.143 "data_offset": 2048, 00:26:02.143 "data_size": 63488 00:26:02.143 }, 00:26:02.143 
{ 00:26:02.143 "name": "BaseBdev4", 00:26:02.143 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:02.143 "is_configured": true, 00:26:02.143 "data_offset": 2048, 00:26:02.143 "data_size": 63488 00:26:02.143 } 00:26:02.143 ] 00:26:02.143 }' 00:26:02.143 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:02.143 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:02.143 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:02.143 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:02.143 18:29:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:02.401 [2024-07-12 18:29:46.043500] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:02.401 18:29:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:02.401 [2024-07-12 18:29:46.109598] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2405d60 00:26:02.401 [2024-07-12 18:29:46.111142] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:02.660 [2024-07-12 18:29:46.231946] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:02.660 [2024-07-12 18:29:46.232377] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:02.918 [2024-07-12 18:29:46.473088] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:02.918 [2024-07-12 18:29:46.473751] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 
offset_begin: 0 offset_end: 6144 00:26:03.177 [2024-07-12 18:29:46.819304] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:03.177 [2024-07-12 18:29:46.819623] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:03.435 [2024-07-12 18:29:47.060039] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:03.435 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:03.435 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:03.435 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:03.435 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:03.435 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:03.435 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.435 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.694 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:03.694 "name": "raid_bdev1", 00:26:03.694 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:03.694 "strip_size_kb": 0, 00:26:03.694 "state": "online", 00:26:03.694 "raid_level": "raid1", 00:26:03.694 "superblock": true, 00:26:03.694 "num_base_bdevs": 4, 00:26:03.694 "num_base_bdevs_discovered": 4, 00:26:03.694 "num_base_bdevs_operational": 4, 00:26:03.694 "process": { 00:26:03.694 "type": "rebuild", 00:26:03.694 "target": "spare", 00:26:03.694 
"progress": { 00:26:03.694 "blocks": 12288, 00:26:03.694 "percent": 19 00:26:03.694 } 00:26:03.694 }, 00:26:03.694 "base_bdevs_list": [ 00:26:03.694 { 00:26:03.694 "name": "spare", 00:26:03.694 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:03.694 "is_configured": true, 00:26:03.694 "data_offset": 2048, 00:26:03.694 "data_size": 63488 00:26:03.694 }, 00:26:03.694 { 00:26:03.694 "name": "BaseBdev2", 00:26:03.694 "uuid": "b198e750-9714-5192-b845-777d43e36444", 00:26:03.694 "is_configured": true, 00:26:03.694 "data_offset": 2048, 00:26:03.694 "data_size": 63488 00:26:03.694 }, 00:26:03.694 { 00:26:03.694 "name": "BaseBdev3", 00:26:03.694 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:03.694 "is_configured": true, 00:26:03.695 "data_offset": 2048, 00:26:03.695 "data_size": 63488 00:26:03.695 }, 00:26:03.695 { 00:26:03.695 "name": "BaseBdev4", 00:26:03.695 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:03.695 "is_configured": true, 00:26:03.695 "data_offset": 2048, 00:26:03.695 "data_size": 63488 00:26:03.695 } 00:26:03.695 ] 00:26:03.695 }' 00:26:03.695 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:03.695 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:03.695 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:03.953 [2024-07-12 18:29:47.429435] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:03.953 [2024-07-12 18:29:47.430673] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:03.953 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:03.953 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:03.953 18:29:47 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:03.953 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:03.953 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:03.953 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:03.953 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:03.953 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:03.953 [2024-07-12 18:29:47.663460] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:03.953 [2024-07-12 18:29:47.663529] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:03.953 [2024-07-12 18:29:47.663811] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:04.212 [2024-07-12 18:29:47.781291] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2406670 00:26:04.212 [2024-07-12 18:29:47.781321] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2405d60 00:26:04.212 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:04.212 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:04.212 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:04.212 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.212 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:04.212 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:04.212 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.212 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.212 18:29:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.471 [2024-07-12 18:29:48.045389] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:04.471 [2024-07-12 18:29:48.046264] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.471 "name": "raid_bdev1", 00:26:04.471 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:04.471 "strip_size_kb": 0, 00:26:04.471 "state": "online", 00:26:04.471 "raid_level": "raid1", 00:26:04.471 "superblock": true, 00:26:04.471 "num_base_bdevs": 4, 00:26:04.471 "num_base_bdevs_discovered": 3, 00:26:04.471 "num_base_bdevs_operational": 3, 00:26:04.471 "process": { 00:26:04.471 "type": "rebuild", 00:26:04.471 "target": "spare", 00:26:04.471 "progress": { 00:26:04.471 "blocks": 20480, 00:26:04.471 "percent": 32 00:26:04.471 } 00:26:04.471 }, 00:26:04.471 "base_bdevs_list": [ 00:26:04.471 { 00:26:04.471 "name": "spare", 00:26:04.471 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:04.471 "is_configured": true, 00:26:04.471 "data_offset": 2048, 00:26:04.471 "data_size": 63488 00:26:04.471 }, 00:26:04.471 { 00:26:04.471 "name": null, 00:26:04.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.471 "is_configured": false, 00:26:04.471 
"data_offset": 2048, 00:26:04.471 "data_size": 63488 00:26:04.471 }, 00:26:04.471 { 00:26:04.471 "name": "BaseBdev3", 00:26:04.471 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:04.471 "is_configured": true, 00:26:04.471 "data_offset": 2048, 00:26:04.471 "data_size": 63488 00:26:04.471 }, 00:26:04.471 { 00:26:04.471 "name": "BaseBdev4", 00:26:04.471 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:04.471 "is_configured": true, 00:26:04.471 "data_offset": 2048, 00:26:04.471 "data_size": 63488 00:26:04.471 } 00:26:04.471 ] 00:26:04.471 }' 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=961 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.471 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:04.472 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:04.472 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.472 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:26:04.472 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.730 [2024-07-12 18:29:48.277098] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:04.730 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.730 "name": "raid_bdev1", 00:26:04.730 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:04.730 "strip_size_kb": 0, 00:26:04.730 "state": "online", 00:26:04.730 "raid_level": "raid1", 00:26:04.730 "superblock": true, 00:26:04.730 "num_base_bdevs": 4, 00:26:04.730 "num_base_bdevs_discovered": 3, 00:26:04.730 "num_base_bdevs_operational": 3, 00:26:04.730 "process": { 00:26:04.730 "type": "rebuild", 00:26:04.730 "target": "spare", 00:26:04.730 "progress": { 00:26:04.730 "blocks": 22528, 00:26:04.730 "percent": 35 00:26:04.730 } 00:26:04.730 }, 00:26:04.730 "base_bdevs_list": [ 00:26:04.730 { 00:26:04.730 "name": "spare", 00:26:04.730 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:04.730 "is_configured": true, 00:26:04.730 "data_offset": 2048, 00:26:04.731 "data_size": 63488 00:26:04.731 }, 00:26:04.731 { 00:26:04.731 "name": null, 00:26:04.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.731 "is_configured": false, 00:26:04.731 "data_offset": 2048, 00:26:04.731 "data_size": 63488 00:26:04.731 }, 00:26:04.731 { 00:26:04.731 "name": "BaseBdev3", 00:26:04.731 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:04.731 "is_configured": true, 00:26:04.731 "data_offset": 2048, 00:26:04.731 "data_size": 63488 00:26:04.731 }, 00:26:04.731 { 00:26:04.731 "name": "BaseBdev4", 00:26:04.731 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:04.731 "is_configured": true, 00:26:04.731 "data_offset": 2048, 00:26:04.731 "data_size": 63488 00:26:04.731 } 00:26:04.731 ] 00:26:04.731 }' 00:26:04.731 18:29:48 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:04.988 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:04.988 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:04.988 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:04.988 18:29:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:05.555 [2024-07-12 18:29:49.079445] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:05.813 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:05.813 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:05.813 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:05.813 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:05.813 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:05.813 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:05.813 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.813 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.070 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:06.070 "name": "raid_bdev1", 00:26:06.070 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:06.070 "strip_size_kb": 0, 00:26:06.070 "state": "online", 
00:26:06.070 "raid_level": "raid1", 00:26:06.070 "superblock": true, 00:26:06.070 "num_base_bdevs": 4, 00:26:06.070 "num_base_bdevs_discovered": 3, 00:26:06.070 "num_base_bdevs_operational": 3, 00:26:06.070 "process": { 00:26:06.070 "type": "rebuild", 00:26:06.071 "target": "spare", 00:26:06.071 "progress": { 00:26:06.071 "blocks": 43008, 00:26:06.071 "percent": 67 00:26:06.071 } 00:26:06.071 }, 00:26:06.071 "base_bdevs_list": [ 00:26:06.071 { 00:26:06.071 "name": "spare", 00:26:06.071 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:06.071 "is_configured": true, 00:26:06.071 "data_offset": 2048, 00:26:06.071 "data_size": 63488 00:26:06.071 }, 00:26:06.071 { 00:26:06.071 "name": null, 00:26:06.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:06.071 "is_configured": false, 00:26:06.071 "data_offset": 2048, 00:26:06.071 "data_size": 63488 00:26:06.071 }, 00:26:06.071 { 00:26:06.071 "name": "BaseBdev3", 00:26:06.071 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:06.071 "is_configured": true, 00:26:06.071 "data_offset": 2048, 00:26:06.071 "data_size": 63488 00:26:06.071 }, 00:26:06.071 { 00:26:06.071 "name": "BaseBdev4", 00:26:06.071 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:06.071 "is_configured": true, 00:26:06.071 "data_offset": 2048, 00:26:06.071 "data_size": 63488 00:26:06.071 } 00:26:06.071 ] 00:26:06.071 }' 00:26:06.071 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:06.330 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:06.330 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:06.330 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:06.330 18:29:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:06.588 [2024-07-12 18:29:50.094583] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:06.847 [2024-07-12 18:29:50.326736] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:07.413 18:29:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:07.413 18:29:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:07.413 18:29:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.413 18:29:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:07.413 18:29:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:07.413 18:29:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.413 18:29:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.413 18:29:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.413 [2024-07-12 18:29:50.897672] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:07.413 [2024-07-12 18:29:51.005945] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:07.413 [2024-07-12 18:29:51.009565] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:07.413 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.413 "name": "raid_bdev1", 00:26:07.413 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:07.413 "strip_size_kb": 0, 00:26:07.413 "state": "online", 00:26:07.413 "raid_level": "raid1", 00:26:07.413 
"superblock": true, 00:26:07.413 "num_base_bdevs": 4, 00:26:07.413 "num_base_bdevs_discovered": 3, 00:26:07.413 "num_base_bdevs_operational": 3, 00:26:07.413 "base_bdevs_list": [ 00:26:07.413 { 00:26:07.413 "name": "spare", 00:26:07.413 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:07.413 "is_configured": true, 00:26:07.413 "data_offset": 2048, 00:26:07.413 "data_size": 63488 00:26:07.413 }, 00:26:07.413 { 00:26:07.413 "name": null, 00:26:07.413 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.413 "is_configured": false, 00:26:07.413 "data_offset": 2048, 00:26:07.414 "data_size": 63488 00:26:07.414 }, 00:26:07.414 { 00:26:07.414 "name": "BaseBdev3", 00:26:07.414 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:07.414 "is_configured": true, 00:26:07.414 "data_offset": 2048, 00:26:07.414 "data_size": 63488 00:26:07.414 }, 00:26:07.414 { 00:26:07.414 "name": "BaseBdev4", 00:26:07.414 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:07.414 "is_configured": true, 00:26:07.414 "data_offset": 2048, 00:26:07.414 "data_size": 63488 00:26:07.414 } 00:26:07.414 ] 00:26:07.414 }' 00:26:07.414 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 
-- # local process_type=none 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.673 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.932 "name": "raid_bdev1", 00:26:07.932 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:07.932 "strip_size_kb": 0, 00:26:07.932 "state": "online", 00:26:07.932 "raid_level": "raid1", 00:26:07.932 "superblock": true, 00:26:07.932 "num_base_bdevs": 4, 00:26:07.932 "num_base_bdevs_discovered": 3, 00:26:07.932 "num_base_bdevs_operational": 3, 00:26:07.932 "base_bdevs_list": [ 00:26:07.932 { 00:26:07.932 "name": "spare", 00:26:07.932 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:07.932 "is_configured": true, 00:26:07.932 "data_offset": 2048, 00:26:07.932 "data_size": 63488 00:26:07.932 }, 00:26:07.932 { 00:26:07.932 "name": null, 00:26:07.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.932 "is_configured": false, 00:26:07.932 "data_offset": 2048, 00:26:07.932 "data_size": 63488 00:26:07.932 }, 00:26:07.932 { 00:26:07.932 "name": "BaseBdev3", 00:26:07.932 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:07.932 "is_configured": true, 00:26:07.932 "data_offset": 2048, 00:26:07.932 "data_size": 63488 00:26:07.932 }, 00:26:07.932 { 00:26:07.932 "name": "BaseBdev4", 00:26:07.932 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:07.932 "is_configured": true, 00:26:07.932 "data_offset": 2048, 00:26:07.932 "data_size": 63488 00:26:07.932 } 00:26:07.932 ] 00:26:07.932 }' 
00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.932 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.192 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:26:08.192 "name": "raid_bdev1", 00:26:08.192 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:08.192 "strip_size_kb": 0, 00:26:08.192 "state": "online", 00:26:08.192 "raid_level": "raid1", 00:26:08.192 "superblock": true, 00:26:08.192 "num_base_bdevs": 4, 00:26:08.192 "num_base_bdevs_discovered": 3, 00:26:08.192 "num_base_bdevs_operational": 3, 00:26:08.192 "base_bdevs_list": [ 00:26:08.192 { 00:26:08.192 "name": "spare", 00:26:08.192 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:08.192 "is_configured": true, 00:26:08.192 "data_offset": 2048, 00:26:08.192 "data_size": 63488 00:26:08.192 }, 00:26:08.192 { 00:26:08.192 "name": null, 00:26:08.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.192 "is_configured": false, 00:26:08.192 "data_offset": 2048, 00:26:08.192 "data_size": 63488 00:26:08.192 }, 00:26:08.192 { 00:26:08.192 "name": "BaseBdev3", 00:26:08.192 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:08.192 "is_configured": true, 00:26:08.192 "data_offset": 2048, 00:26:08.192 "data_size": 63488 00:26:08.192 }, 00:26:08.192 { 00:26:08.192 "name": "BaseBdev4", 00:26:08.192 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:08.192 "is_configured": true, 00:26:08.192 "data_offset": 2048, 00:26:08.192 "data_size": 63488 00:26:08.192 } 00:26:08.192 ] 00:26:08.192 }' 00:26:08.192 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:08.192 18:29:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:08.759 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:09.017 [2024-07-12 18:29:52.633624] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:09.017 [2024-07-12 18:29:52.633656] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to 
offline 00:26:09.017 00:26:09.017 Latency(us) 00:26:09.017 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:09.017 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:09.017 raid_bdev1 : 11.13 89.58 268.73 0.00 0.00 14830.31 283.16 113519.75 00:26:09.017 =================================================================================================================== 00:26:09.017 Total : 89.58 268.73 0.00 0.00 14830.31 283.16 113519.75 00:26:09.017 [2024-07-12 18:29:52.669662] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:09.017 [2024-07-12 18:29:52.669694] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:09.017 [2024-07-12 18:29:52.669786] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:09.017 [2024-07-12 18:29:52.669799] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25378a0 name raid_bdev1, state offline 00:26:09.017 0 00:26:09.017 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.017 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:09.285 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:09.285 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:09.285 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:09.285 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:09.285 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:09.285 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:09.285 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:09.286 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:09.286 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:09.286 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:09.286 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:09.286 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:09.286 18:29:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:09.552 /dev/nbd0 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:09.552 18:29:53 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:09.552 1+0 records in 00:26:09.552 1+0 records out 00:26:09.552 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270883 s, 15.1 MB/s 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:09.552 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:09.553 
18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:09.553 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:09.811 /dev/nbd1 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( 
i <= 20 )) 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:09.811 1+0 records in 00:26:09.811 1+0 records out 00:26:09.811 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026247 s, 15.6 MB/s 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in 
"${nbd_list[@]}" 00:26:09.811 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:10.070 18:29:53 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:10.070 18:29:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:10.328 /dev/nbd1 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:10.328 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:10.586 1+0 records in 00:26:10.586 1+0 records out 00:26:10.586 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263138 s, 15.6 MB/s 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:10.587 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit 
nbd1 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:10.845 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i 
<= 20 )) 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:11.103 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:11.362 18:29:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:11.621 [2024-07-12 18:29:55.144649] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:11.621 [2024-07-12 18:29:55.144699] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:11.621 [2024-07-12 18:29:55.144720] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2538340 00:26:11.621 [2024-07-12 18:29:55.144733] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:11.621 [2024-07-12 18:29:55.146383] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:11.621 [2024-07-12 18:29:55.146416] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:11.621 [2024-07-12 18:29:55.146498] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:11.621 [2024-07-12 18:29:55.146525] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:11.621 [2024-07-12 18:29:55.146634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:11.621 [2024-07-12 18:29:55.146711] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:11.621 spare 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.621 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.621 [2024-07-12 18:29:55.247025] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2538650 00:26:11.621 [2024-07-12 18:29:55.247042] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:11.621 [2024-07-12 18:29:55.247240] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x253a510 00:26:11.621 [2024-07-12 18:29:55.247385] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2538650 00:26:11.621 [2024-07-12 18:29:55.247396] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2538650 00:26:11.621 [2024-07-12 18:29:55.247502] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:11.880 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:11.880 "name": "raid_bdev1", 00:26:11.880 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:11.880 "strip_size_kb": 0, 00:26:11.880 "state": "online", 00:26:11.880 "raid_level": "raid1", 00:26:11.880 "superblock": true, 00:26:11.880 "num_base_bdevs": 4, 00:26:11.880 "num_base_bdevs_discovered": 3, 00:26:11.880 "num_base_bdevs_operational": 3, 00:26:11.880 "base_bdevs_list": [ 00:26:11.880 { 00:26:11.880 "name": "spare", 00:26:11.880 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:11.880 "is_configured": true, 00:26:11.880 "data_offset": 2048, 00:26:11.880 "data_size": 63488 00:26:11.880 }, 00:26:11.880 { 00:26:11.880 "name": null, 00:26:11.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.880 "is_configured": false, 00:26:11.880 "data_offset": 2048, 00:26:11.880 "data_size": 63488 00:26:11.880 }, 00:26:11.880 { 00:26:11.880 "name": "BaseBdev3", 00:26:11.880 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:11.880 "is_configured": true, 00:26:11.880 "data_offset": 2048, 00:26:11.880 "data_size": 63488 00:26:11.880 }, 00:26:11.880 { 00:26:11.880 "name": "BaseBdev4", 00:26:11.880 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:11.880 "is_configured": true, 00:26:11.880 "data_offset": 2048, 00:26:11.880 "data_size": 63488 00:26:11.880 } 00:26:11.880 ] 00:26:11.880 }' 00:26:11.880 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:11.880 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:12.445 18:29:55 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:12.445 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:12.445 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:12.445 18:29:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:12.445 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:12.445 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.445 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.704 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:12.704 "name": "raid_bdev1", 00:26:12.704 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:12.704 "strip_size_kb": 0, 00:26:12.704 "state": "online", 00:26:12.704 "raid_level": "raid1", 00:26:12.704 "superblock": true, 00:26:12.704 "num_base_bdevs": 4, 00:26:12.704 "num_base_bdevs_discovered": 3, 00:26:12.704 "num_base_bdevs_operational": 3, 00:26:12.704 "base_bdevs_list": [ 00:26:12.704 { 00:26:12.704 "name": "spare", 00:26:12.704 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:12.704 "is_configured": true, 00:26:12.704 "data_offset": 2048, 00:26:12.704 "data_size": 63488 00:26:12.704 }, 00:26:12.704 { 00:26:12.704 "name": null, 00:26:12.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.704 "is_configured": false, 00:26:12.704 "data_offset": 2048, 00:26:12.704 "data_size": 63488 00:26:12.704 }, 00:26:12.704 { 00:26:12.704 "name": "BaseBdev3", 00:26:12.704 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:12.704 "is_configured": true, 00:26:12.704 "data_offset": 2048, 
00:26:12.704 "data_size": 63488 00:26:12.704 }, 00:26:12.704 { 00:26:12.704 "name": "BaseBdev4", 00:26:12.704 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:12.704 "is_configured": true, 00:26:12.704 "data_offset": 2048, 00:26:12.704 "data_size": 63488 00:26:12.704 } 00:26:12.704 ] 00:26:12.704 }' 00:26:12.704 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:12.704 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:12.704 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:12.704 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:12.704 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.704 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:12.963 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:12.963 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:13.222 [2024-07-12 18:29:56.741198] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:13.222 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:13.222 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:13.222 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:13.222 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:13.222 
18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:13.222 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:13.222 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:13.222 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:13.223 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:13.223 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:13.223 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.223 18:29:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.481 18:29:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:13.481 "name": "raid_bdev1", 00:26:13.481 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:13.481 "strip_size_kb": 0, 00:26:13.481 "state": "online", 00:26:13.481 "raid_level": "raid1", 00:26:13.481 "superblock": true, 00:26:13.481 "num_base_bdevs": 4, 00:26:13.481 "num_base_bdevs_discovered": 2, 00:26:13.481 "num_base_bdevs_operational": 2, 00:26:13.481 "base_bdevs_list": [ 00:26:13.481 { 00:26:13.481 "name": null, 00:26:13.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.481 "is_configured": false, 00:26:13.481 "data_offset": 2048, 00:26:13.481 "data_size": 63488 00:26:13.481 }, 00:26:13.481 { 00:26:13.481 "name": null, 00:26:13.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.481 "is_configured": false, 00:26:13.481 "data_offset": 2048, 00:26:13.481 "data_size": 63488 00:26:13.481 }, 00:26:13.481 { 00:26:13.481 "name": "BaseBdev3", 00:26:13.481 "uuid": 
"f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:13.481 "is_configured": true, 00:26:13.481 "data_offset": 2048, 00:26:13.481 "data_size": 63488 00:26:13.481 }, 00:26:13.481 { 00:26:13.481 "name": "BaseBdev4", 00:26:13.481 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:13.481 "is_configured": true, 00:26:13.481 "data_offset": 2048, 00:26:13.481 "data_size": 63488 00:26:13.481 } 00:26:13.481 ] 00:26:13.481 }' 00:26:13.481 18:29:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:13.481 18:29:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:14.102 18:29:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:14.102 [2024-07-12 18:29:57.804170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:14.102 [2024-07-12 18:29:57.804321] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:14.102 [2024-07-12 18:29:57.804337] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:14.102 [2024-07-12 18:29:57.804366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:14.102 [2024-07-12 18:29:57.808827] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x253a5d0 00:26:14.102 [2024-07-12 18:29:57.811101] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:14.377 18:29:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:15.311 18:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:15.311 18:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:15.311 18:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:15.311 18:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:15.311 18:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:15.311 18:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.311 18:29:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:15.570 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:15.570 "name": "raid_bdev1", 00:26:15.570 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:15.570 "strip_size_kb": 0, 00:26:15.570 "state": "online", 00:26:15.570 "raid_level": "raid1", 00:26:15.570 "superblock": true, 00:26:15.570 "num_base_bdevs": 4, 00:26:15.570 "num_base_bdevs_discovered": 3, 00:26:15.570 "num_base_bdevs_operational": 3, 00:26:15.570 "process": { 00:26:15.570 "type": "rebuild", 00:26:15.570 "target": "spare", 00:26:15.570 "progress": { 00:26:15.570 "blocks": 24576, 
00:26:15.570 "percent": 38 00:26:15.570 } 00:26:15.570 }, 00:26:15.570 "base_bdevs_list": [ 00:26:15.570 { 00:26:15.570 "name": "spare", 00:26:15.570 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:15.570 "is_configured": true, 00:26:15.570 "data_offset": 2048, 00:26:15.570 "data_size": 63488 00:26:15.570 }, 00:26:15.570 { 00:26:15.570 "name": null, 00:26:15.570 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:15.570 "is_configured": false, 00:26:15.570 "data_offset": 2048, 00:26:15.570 "data_size": 63488 00:26:15.570 }, 00:26:15.570 { 00:26:15.570 "name": "BaseBdev3", 00:26:15.570 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:15.570 "is_configured": true, 00:26:15.570 "data_offset": 2048, 00:26:15.570 "data_size": 63488 00:26:15.570 }, 00:26:15.570 { 00:26:15.570 "name": "BaseBdev4", 00:26:15.570 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:15.570 "is_configured": true, 00:26:15.570 "data_offset": 2048, 00:26:15.570 "data_size": 63488 00:26:15.570 } 00:26:15.570 ] 00:26:15.570 }' 00:26:15.570 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:15.570 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:15.570 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:15.570 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:15.570 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:15.829 [2024-07-12 18:29:59.391858] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:15.829 [2024-07-12 18:29:59.423871] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:15.829 [2024-07-12 18:29:59.423914] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:15.829 [2024-07-12 18:29:59.423937] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:15.829 [2024-07-12 18:29:59.423945] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.829 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.087 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:16.087 "name": "raid_bdev1", 00:26:16.087 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 
00:26:16.087 "strip_size_kb": 0, 00:26:16.087 "state": "online", 00:26:16.087 "raid_level": "raid1", 00:26:16.087 "superblock": true, 00:26:16.087 "num_base_bdevs": 4, 00:26:16.087 "num_base_bdevs_discovered": 2, 00:26:16.087 "num_base_bdevs_operational": 2, 00:26:16.087 "base_bdevs_list": [ 00:26:16.087 { 00:26:16.087 "name": null, 00:26:16.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.087 "is_configured": false, 00:26:16.087 "data_offset": 2048, 00:26:16.087 "data_size": 63488 00:26:16.087 }, 00:26:16.087 { 00:26:16.088 "name": null, 00:26:16.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.088 "is_configured": false, 00:26:16.088 "data_offset": 2048, 00:26:16.088 "data_size": 63488 00:26:16.088 }, 00:26:16.088 { 00:26:16.088 "name": "BaseBdev3", 00:26:16.088 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:16.088 "is_configured": true, 00:26:16.088 "data_offset": 2048, 00:26:16.088 "data_size": 63488 00:26:16.088 }, 00:26:16.088 { 00:26:16.088 "name": "BaseBdev4", 00:26:16.088 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:16.088 "is_configured": true, 00:26:16.088 "data_offset": 2048, 00:26:16.088 "data_size": 63488 00:26:16.088 } 00:26:16.088 ] 00:26:16.088 }' 00:26:16.088 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:16.088 18:29:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:16.654 18:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:16.912 [2024-07-12 18:30:00.547299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:16.912 [2024-07-12 18:30:00.547359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.912 [2024-07-12 18:30:00.547384] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x2539380 00:26:16.912 [2024-07-12 18:30:00.547397] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.912 [2024-07-12 18:30:00.547791] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.912 [2024-07-12 18:30:00.547811] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:16.912 [2024-07-12 18:30:00.547902] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:16.912 [2024-07-12 18:30:00.547914] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:16.912 [2024-07-12 18:30:00.547936] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:16.912 [2024-07-12 18:30:00.547957] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:16.912 [2024-07-12 18:30:00.552705] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x253ca60 00:26:16.912 spare 00:26:16.912 [2024-07-12 18:30:00.554129] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:16.912 18:30:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:18.284 "name": "raid_bdev1", 00:26:18.284 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:18.284 "strip_size_kb": 0, 00:26:18.284 "state": "online", 00:26:18.284 "raid_level": "raid1", 00:26:18.284 "superblock": true, 00:26:18.284 "num_base_bdevs": 4, 00:26:18.284 "num_base_bdevs_discovered": 3, 00:26:18.284 "num_base_bdevs_operational": 3, 00:26:18.284 "process": { 00:26:18.284 "type": "rebuild", 00:26:18.284 "target": "spare", 00:26:18.284 "progress": { 00:26:18.284 "blocks": 24576, 00:26:18.284 "percent": 38 00:26:18.284 } 00:26:18.284 }, 00:26:18.284 "base_bdevs_list": [ 00:26:18.284 { 00:26:18.284 "name": "spare", 00:26:18.284 "uuid": "245dce74-1e73-5c12-b02f-49405435812b", 00:26:18.284 "is_configured": true, 00:26:18.284 "data_offset": 2048, 00:26:18.284 "data_size": 63488 00:26:18.284 }, 00:26:18.284 { 00:26:18.284 "name": null, 00:26:18.284 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.284 "is_configured": false, 00:26:18.284 "data_offset": 2048, 00:26:18.284 "data_size": 63488 00:26:18.284 }, 00:26:18.284 { 00:26:18.284 "name": "BaseBdev3", 00:26:18.284 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:18.284 "is_configured": true, 00:26:18.284 "data_offset": 2048, 00:26:18.284 "data_size": 63488 00:26:18.284 }, 00:26:18.284 { 00:26:18.284 "name": "BaseBdev4", 00:26:18.284 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:18.284 "is_configured": true, 00:26:18.284 "data_offset": 2048, 00:26:18.284 "data_size": 63488 00:26:18.284 } 00:26:18.284 ] 00:26:18.284 }' 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:18.284 18:30:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:18.543 [2024-07-12 18:30:02.133782] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:18.543 [2024-07-12 18:30:02.166719] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:18.543 [2024-07-12 18:30:02.166761] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:18.543 [2024-07-12 18:30:02.166778] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:18.543 [2024-07-12 18:30:02.166786] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.543 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.800 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:18.800 "name": "raid_bdev1", 00:26:18.800 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:18.800 "strip_size_kb": 0, 00:26:18.800 "state": "online", 00:26:18.800 "raid_level": "raid1", 00:26:18.800 "superblock": true, 00:26:18.800 "num_base_bdevs": 4, 00:26:18.800 "num_base_bdevs_discovered": 2, 00:26:18.800 "num_base_bdevs_operational": 2, 00:26:18.800 "base_bdevs_list": [ 00:26:18.800 { 00:26:18.800 "name": null, 00:26:18.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.800 "is_configured": false, 00:26:18.800 "data_offset": 2048, 00:26:18.800 "data_size": 63488 00:26:18.800 }, 00:26:18.800 { 00:26:18.800 "name": null, 00:26:18.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.800 "is_configured": false, 00:26:18.800 "data_offset": 2048, 00:26:18.800 "data_size": 63488 00:26:18.800 }, 00:26:18.800 { 00:26:18.800 "name": "BaseBdev3", 00:26:18.800 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:18.800 "is_configured": true, 00:26:18.800 "data_offset": 2048, 00:26:18.800 "data_size": 63488 00:26:18.800 }, 00:26:18.800 { 00:26:18.800 "name": "BaseBdev4", 00:26:18.800 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:18.800 "is_configured": true, 00:26:18.800 "data_offset": 2048, 
00:26:18.800 "data_size": 63488 00:26:18.800 } 00:26:18.800 ] 00:26:18.800 }' 00:26:18.800 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:18.800 18:30:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:19.365 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:19.365 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:19.365 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:19.365 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:19.365 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:19.365 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.365 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.621 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:19.621 "name": "raid_bdev1", 00:26:19.621 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:19.621 "strip_size_kb": 0, 00:26:19.621 "state": "online", 00:26:19.621 "raid_level": "raid1", 00:26:19.621 "superblock": true, 00:26:19.622 "num_base_bdevs": 4, 00:26:19.622 "num_base_bdevs_discovered": 2, 00:26:19.622 "num_base_bdevs_operational": 2, 00:26:19.622 "base_bdevs_list": [ 00:26:19.622 { 00:26:19.622 "name": null, 00:26:19.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:19.622 "is_configured": false, 00:26:19.622 "data_offset": 2048, 00:26:19.622 "data_size": 63488 00:26:19.622 }, 00:26:19.622 { 00:26:19.622 "name": null, 00:26:19.622 "uuid": "00000000-0000-0000-0000-000000000000", 
00:26:19.622 "is_configured": false, 00:26:19.622 "data_offset": 2048, 00:26:19.622 "data_size": 63488 00:26:19.622 }, 00:26:19.622 { 00:26:19.622 "name": "BaseBdev3", 00:26:19.622 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:19.622 "is_configured": true, 00:26:19.622 "data_offset": 2048, 00:26:19.622 "data_size": 63488 00:26:19.622 }, 00:26:19.622 { 00:26:19.622 "name": "BaseBdev4", 00:26:19.622 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:19.622 "is_configured": true, 00:26:19.622 "data_offset": 2048, 00:26:19.622 "data_size": 63488 00:26:19.622 } 00:26:19.622 ] 00:26:19.622 }' 00:26:19.622 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:19.622 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:19.622 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:19.878 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:19.878 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:20.136 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:20.136 [2024-07-12 18:30:03.843605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:20.136 [2024-07-12 18:30:03.843659] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.136 [2024-07-12 18:30:03.843681] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x253df10 00:26:20.136 [2024-07-12 18:30:03.843694] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.136 
[2024-07-12 18:30:03.844060] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.136 [2024-07-12 18:30:03.844080] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:20.136 [2024-07-12 18:30:03.844149] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:20.136 [2024-07-12 18:30:03.844161] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:20.136 [2024-07-12 18:30:03.844172] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:20.136 BaseBdev1 00:26:20.393 18:30:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.325 18:30:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.603 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:21.603 "name": "raid_bdev1", 00:26:21.603 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:21.603 "strip_size_kb": 0, 00:26:21.603 "state": "online", 00:26:21.603 "raid_level": "raid1", 00:26:21.603 "superblock": true, 00:26:21.603 "num_base_bdevs": 4, 00:26:21.603 "num_base_bdevs_discovered": 2, 00:26:21.603 "num_base_bdevs_operational": 2, 00:26:21.603 "base_bdevs_list": [ 00:26:21.603 { 00:26:21.603 "name": null, 00:26:21.603 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:21.603 "is_configured": false, 00:26:21.603 "data_offset": 2048, 00:26:21.603 "data_size": 63488 00:26:21.603 }, 00:26:21.603 { 00:26:21.603 "name": null, 00:26:21.603 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:21.603 "is_configured": false, 00:26:21.603 "data_offset": 2048, 00:26:21.603 "data_size": 63488 00:26:21.603 }, 00:26:21.603 { 00:26:21.603 "name": "BaseBdev3", 00:26:21.603 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:21.603 "is_configured": true, 00:26:21.603 "data_offset": 2048, 00:26:21.603 "data_size": 63488 00:26:21.603 }, 00:26:21.603 { 00:26:21.603 "name": "BaseBdev4", 00:26:21.603 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:21.603 "is_configured": true, 00:26:21.603 "data_offset": 2048, 00:26:21.603 "data_size": 63488 00:26:21.603 } 00:26:21.603 ] 00:26:21.603 }' 00:26:21.603 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:21.603 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:22.166 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:26:22.167 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:22.167 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:22.167 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:22.167 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:22.167 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.167 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.424 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:22.424 "name": "raid_bdev1", 00:26:22.424 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:22.424 "strip_size_kb": 0, 00:26:22.424 "state": "online", 00:26:22.424 "raid_level": "raid1", 00:26:22.424 "superblock": true, 00:26:22.424 "num_base_bdevs": 4, 00:26:22.424 "num_base_bdevs_discovered": 2, 00:26:22.424 "num_base_bdevs_operational": 2, 00:26:22.424 "base_bdevs_list": [ 00:26:22.424 { 00:26:22.424 "name": null, 00:26:22.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:22.424 "is_configured": false, 00:26:22.424 "data_offset": 2048, 00:26:22.424 "data_size": 63488 00:26:22.424 }, 00:26:22.424 { 00:26:22.424 "name": null, 00:26:22.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:22.424 "is_configured": false, 00:26:22.424 "data_offset": 2048, 00:26:22.424 "data_size": 63488 00:26:22.424 }, 00:26:22.424 { 00:26:22.424 "name": "BaseBdev3", 00:26:22.424 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:22.424 "is_configured": true, 00:26:22.424 "data_offset": 2048, 00:26:22.424 "data_size": 63488 00:26:22.424 }, 00:26:22.424 { 
00:26:22.424 "name": "BaseBdev4", 00:26:22.424 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:22.424 "is_configured": true, 00:26:22.424 "data_offset": 2048, 00:26:22.424 "data_size": 63488 00:26:22.424 } 00:26:22.424 ] 00:26:22.424 }' 00:26:22.424 18:30:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:22.424 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:22.990 [2024-07-12 18:30:06.543139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:22.990 [2024-07-12 18:30:06.543270] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:22.990 [2024-07-12 18:30:06.543286] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:22.990 request: 00:26:22.990 { 00:26:22.990 "base_bdev": "BaseBdev1", 00:26:22.990 "raid_bdev": "raid_bdev1", 00:26:22.990 "method": "bdev_raid_add_base_bdev", 00:26:22.990 "req_id": 1 00:26:22.990 } 00:26:22.990 Got JSON-RPC error response 00:26:22.990 response: 00:26:22.990 { 00:26:22.990 "code": -22, 00:26:22.990 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:22.990 } 00:26:22.990 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:26:22.990 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:22.990 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:22.990 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:22.990 18:30:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.922 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.180 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:24.180 "name": "raid_bdev1", 00:26:24.180 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:24.180 "strip_size_kb": 0, 00:26:24.180 "state": "online", 00:26:24.180 "raid_level": "raid1", 00:26:24.180 "superblock": true, 00:26:24.180 "num_base_bdevs": 4, 00:26:24.180 
"num_base_bdevs_discovered": 2, 00:26:24.180 "num_base_bdevs_operational": 2, 00:26:24.180 "base_bdevs_list": [ 00:26:24.180 { 00:26:24.180 "name": null, 00:26:24.180 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.180 "is_configured": false, 00:26:24.180 "data_offset": 2048, 00:26:24.180 "data_size": 63488 00:26:24.180 }, 00:26:24.180 { 00:26:24.180 "name": null, 00:26:24.180 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.180 "is_configured": false, 00:26:24.180 "data_offset": 2048, 00:26:24.180 "data_size": 63488 00:26:24.180 }, 00:26:24.180 { 00:26:24.180 "name": "BaseBdev3", 00:26:24.180 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:24.180 "is_configured": true, 00:26:24.180 "data_offset": 2048, 00:26:24.180 "data_size": 63488 00:26:24.180 }, 00:26:24.180 { 00:26:24.180 "name": "BaseBdev4", 00:26:24.180 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:24.180 "is_configured": true, 00:26:24.180 "data_offset": 2048, 00:26:24.180 "data_size": 63488 00:26:24.180 } 00:26:24.180 ] 00:26:24.180 }' 00:26:24.180 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:24.180 18:30:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:24.745 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:24.745 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:24.745 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:24.745 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:24.745 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:24.745 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:26:24.745 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.003 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:25.003 "name": "raid_bdev1", 00:26:25.003 "uuid": "6fbf3f3f-9a1e-4915-9d01-4c9f0cb9c2a2", 00:26:25.003 "strip_size_kb": 0, 00:26:25.003 "state": "online", 00:26:25.003 "raid_level": "raid1", 00:26:25.003 "superblock": true, 00:26:25.003 "num_base_bdevs": 4, 00:26:25.003 "num_base_bdevs_discovered": 2, 00:26:25.003 "num_base_bdevs_operational": 2, 00:26:25.003 "base_bdevs_list": [ 00:26:25.003 { 00:26:25.003 "name": null, 00:26:25.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.003 "is_configured": false, 00:26:25.003 "data_offset": 2048, 00:26:25.003 "data_size": 63488 00:26:25.003 }, 00:26:25.003 { 00:26:25.003 "name": null, 00:26:25.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.003 "is_configured": false, 00:26:25.003 "data_offset": 2048, 00:26:25.004 "data_size": 63488 00:26:25.004 }, 00:26:25.004 { 00:26:25.004 "name": "BaseBdev3", 00:26:25.004 "uuid": "f6fcb52a-aa9f-5e61-bc4d-15408291353f", 00:26:25.004 "is_configured": true, 00:26:25.004 "data_offset": 2048, 00:26:25.004 "data_size": 63488 00:26:25.004 }, 00:26:25.004 { 00:26:25.004 "name": "BaseBdev4", 00:26:25.004 "uuid": "a660dbeb-0a77-5b3e-9ca1-f6163da5ea4d", 00:26:25.004 "is_configured": true, 00:26:25.004 "data_offset": 2048, 00:26:25.004 "data_size": 63488 00:26:25.004 } 00:26:25.004 ] 00:26:25.004 }' 00:26:25.004 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:25.004 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:25.004 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2592510 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2592510 ']' 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2592510 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2592510 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2592510' 00:26:25.262 killing process with pid 2592510 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2592510 00:26:25.262 Received shutdown signal, test time was about 27.236288 seconds 00:26:25.262 00:26:25.262 Latency(us) 00:26:25.262 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:25.262 =================================================================================================================== 00:26:25.262 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:25.262 [2024-07-12 18:30:08.811182] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:25.262 [2024-07-12 18:30:08.811286] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:25.262 18:30:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2592510 00:26:25.262 [2024-07-12 18:30:08.811347] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:25.262 [2024-07-12 18:30:08.811361] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2538650 name raid_bdev1, state offline 00:26:25.262 [2024-07-12 18:30:08.853187] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:25.520 18:30:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:25.520 00:26:25.520 real 0m32.722s 00:26:25.520 user 0m51.751s 00:26:25.520 sys 0m5.230s 00:26:25.520 18:30:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:25.520 18:30:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:25.520 ************************************ 00:26:25.520 END TEST raid_rebuild_test_sb_io 00:26:25.520 ************************************ 00:26:25.520 18:30:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:25.520 18:30:09 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:26:25.520 18:30:09 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:26:25.520 18:30:09 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:26:25.520 18:30:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:25.520 18:30:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:25.520 18:30:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:25.520 ************************************ 00:26:25.520 START TEST raid_state_function_test_sb_4k 00:26:25.520 ************************************ 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:25.520 18:30:09 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:25.520 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=2597705 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2597705' 00:26:25.521 Process raid pid: 2597705 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 2597705 /var/tmp/spdk-raid.sock 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2597705 ']' 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:25.521 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:25.521 18:30:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:25.521 [2024-07-12 18:30:09.217256] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:26:25.521 [2024-07-12 18:30:09.217324] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:25.778 [2024-07-12 18:30:09.346183] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:25.778 [2024-07-12 18:30:09.452375] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:26.036 [2024-07-12 18:30:09.515811] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:26.036 [2024-07-12 18:30:09.515842] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:26.602 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:26.602 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:26.602 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:26.602 [2024-07-12 18:30:10.310872] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:26.602 [2024-07-12 18:30:10.310914] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 
doesn't exist now 00:26:26.602 [2024-07-12 18:30:10.310933] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:26.602 [2024-07-12 18:30:10.310945] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.861 "name": "Existed_Raid", 
00:26:26.861 "uuid": "908d7929-f834-4e4d-aac5-9bc42ca6c2fe", 00:26:26.861 "strip_size_kb": 0, 00:26:26.861 "state": "configuring", 00:26:26.861 "raid_level": "raid1", 00:26:26.861 "superblock": true, 00:26:26.861 "num_base_bdevs": 2, 00:26:26.861 "num_base_bdevs_discovered": 0, 00:26:26.861 "num_base_bdevs_operational": 2, 00:26:26.861 "base_bdevs_list": [ 00:26:26.861 { 00:26:26.861 "name": "BaseBdev1", 00:26:26.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:26.861 "is_configured": false, 00:26:26.861 "data_offset": 0, 00:26:26.861 "data_size": 0 00:26:26.861 }, 00:26:26.861 { 00:26:26.861 "name": "BaseBdev2", 00:26:26.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:26.861 "is_configured": false, 00:26:26.861 "data_offset": 0, 00:26:26.861 "data_size": 0 00:26:26.861 } 00:26:26.861 ] 00:26:26.861 }' 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:26.861 18:30:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:27.426 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:27.683 [2024-07-12 18:30:11.305355] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:27.683 [2024-07-12 18:30:11.305385] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1effa80 name Existed_Raid, state configuring 00:26:27.683 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:27.940 [2024-07-12 18:30:11.554029] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:27.940 [2024-07-12 18:30:11.554054] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: 
*DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:27.940 [2024-07-12 18:30:11.554063] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:27.940 [2024-07-12 18:30:11.554075] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:27.940 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:26:28.197 [2024-07-12 18:30:11.808645] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:28.197 BaseBdev1 00:26:28.197 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:28.197 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:28.197 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:28.197 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:28.197 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:28.197 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:28.197 18:30:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:28.455 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:28.765 [ 00:26:28.765 { 00:26:28.765 "name": "BaseBdev1", 00:26:28.765 "aliases": [ 00:26:28.765 "c0fe2ba1-fa43-447d-b016-07c76f36f198" 00:26:28.765 ], 00:26:28.765 
"product_name": "Malloc disk", 00:26:28.765 "block_size": 4096, 00:26:28.765 "num_blocks": 8192, 00:26:28.765 "uuid": "c0fe2ba1-fa43-447d-b016-07c76f36f198", 00:26:28.765 "assigned_rate_limits": { 00:26:28.765 "rw_ios_per_sec": 0, 00:26:28.765 "rw_mbytes_per_sec": 0, 00:26:28.765 "r_mbytes_per_sec": 0, 00:26:28.765 "w_mbytes_per_sec": 0 00:26:28.765 }, 00:26:28.765 "claimed": true, 00:26:28.765 "claim_type": "exclusive_write", 00:26:28.765 "zoned": false, 00:26:28.765 "supported_io_types": { 00:26:28.765 "read": true, 00:26:28.765 "write": true, 00:26:28.765 "unmap": true, 00:26:28.765 "flush": true, 00:26:28.765 "reset": true, 00:26:28.765 "nvme_admin": false, 00:26:28.765 "nvme_io": false, 00:26:28.765 "nvme_io_md": false, 00:26:28.765 "write_zeroes": true, 00:26:28.765 "zcopy": true, 00:26:28.765 "get_zone_info": false, 00:26:28.765 "zone_management": false, 00:26:28.765 "zone_append": false, 00:26:28.765 "compare": false, 00:26:28.765 "compare_and_write": false, 00:26:28.765 "abort": true, 00:26:28.765 "seek_hole": false, 00:26:28.765 "seek_data": false, 00:26:28.765 "copy": true, 00:26:28.765 "nvme_iov_md": false 00:26:28.765 }, 00:26:28.765 "memory_domains": [ 00:26:28.765 { 00:26:28.765 "dma_device_id": "system", 00:26:28.765 "dma_device_type": 1 00:26:28.765 }, 00:26:28.765 { 00:26:28.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:28.765 "dma_device_type": 2 00:26:28.765 } 00:26:28.765 ], 00:26:28.765 "driver_specific": {} 00:26:28.765 } 00:26:28.765 ] 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:28.765 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.029 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.029 "name": "Existed_Raid", 00:26:29.029 "uuid": "bbd63bea-f9c8-43d7-a644-3deec5cf25dc", 00:26:29.029 "strip_size_kb": 0, 00:26:29.029 "state": "configuring", 00:26:29.029 "raid_level": "raid1", 00:26:29.029 "superblock": true, 00:26:29.029 "num_base_bdevs": 2, 00:26:29.029 "num_base_bdevs_discovered": 1, 00:26:29.029 "num_base_bdevs_operational": 2, 00:26:29.029 "base_bdevs_list": [ 00:26:29.029 { 00:26:29.029 "name": "BaseBdev1", 00:26:29.029 "uuid": "c0fe2ba1-fa43-447d-b016-07c76f36f198", 00:26:29.029 "is_configured": true, 00:26:29.029 "data_offset": 256, 00:26:29.029 "data_size": 7936 00:26:29.029 }, 00:26:29.029 { 00:26:29.029 "name": "BaseBdev2", 00:26:29.029 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:29.029 "is_configured": false, 00:26:29.029 "data_offset": 0, 00:26:29.029 "data_size": 0 00:26:29.029 } 00:26:29.029 ] 00:26:29.029 }' 00:26:29.029 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.029 18:30:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:29.593 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:29.850 [2024-07-12 18:30:13.392835] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:29.850 [2024-07-12 18:30:13.392871] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eff350 name Existed_Raid, state configuring 00:26:29.850 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:30.109 [2024-07-12 18:30:13.637516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:30.109 [2024-07-12 18:30:13.638998] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:30.109 [2024-07-12 18:30:13.639031] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.109 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:30.366 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:30.366 "name": "Existed_Raid", 00:26:30.366 "uuid": "fc8cd25c-fb13-4b72-b38e-48ed96b274d2", 00:26:30.366 "strip_size_kb": 0, 00:26:30.366 "state": "configuring", 00:26:30.366 "raid_level": "raid1", 00:26:30.366 "superblock": true, 00:26:30.366 "num_base_bdevs": 2, 00:26:30.366 "num_base_bdevs_discovered": 1, 00:26:30.366 "num_base_bdevs_operational": 2, 00:26:30.366 "base_bdevs_list": [ 00:26:30.366 { 00:26:30.366 "name": "BaseBdev1", 00:26:30.366 "uuid": "c0fe2ba1-fa43-447d-b016-07c76f36f198", 00:26:30.366 "is_configured": true, 00:26:30.366 "data_offset": 256, 
00:26:30.366 "data_size": 7936 00:26:30.366 }, 00:26:30.366 { 00:26:30.366 "name": "BaseBdev2", 00:26:30.366 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.366 "is_configured": false, 00:26:30.366 "data_offset": 0, 00:26:30.366 "data_size": 0 00:26:30.366 } 00:26:30.366 ] 00:26:30.366 }' 00:26:30.366 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:30.366 18:30:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:30.931 18:30:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:26:31.189 [2024-07-12 18:30:14.764414] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:31.189 [2024-07-12 18:30:14.764559] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f00000 00:26:31.189 [2024-07-12 18:30:14.764573] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:31.189 [2024-07-12 18:30:14.764747] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e1a0c0 00:26:31.189 [2024-07-12 18:30:14.764867] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f00000 00:26:31.189 [2024-07-12 18:30:14.764877] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f00000 00:26:31.189 [2024-07-12 18:30:14.764978] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:31.189 BaseBdev2 00:26:31.189 18:30:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:31.189 18:30:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:31.189 18:30:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local 
bdev_timeout= 00:26:31.189 18:30:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:31.189 18:30:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:31.189 18:30:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:31.189 18:30:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:31.448 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:31.707 [ 00:26:31.707 { 00:26:31.707 "name": "BaseBdev2", 00:26:31.707 "aliases": [ 00:26:31.707 "736c993e-48b8-4e9f-bae4-ed5e44f8479c" 00:26:31.707 ], 00:26:31.707 "product_name": "Malloc disk", 00:26:31.707 "block_size": 4096, 00:26:31.707 "num_blocks": 8192, 00:26:31.707 "uuid": "736c993e-48b8-4e9f-bae4-ed5e44f8479c", 00:26:31.707 "assigned_rate_limits": { 00:26:31.707 "rw_ios_per_sec": 0, 00:26:31.707 "rw_mbytes_per_sec": 0, 00:26:31.707 "r_mbytes_per_sec": 0, 00:26:31.707 "w_mbytes_per_sec": 0 00:26:31.707 }, 00:26:31.707 "claimed": true, 00:26:31.707 "claim_type": "exclusive_write", 00:26:31.707 "zoned": false, 00:26:31.707 "supported_io_types": { 00:26:31.707 "read": true, 00:26:31.707 "write": true, 00:26:31.707 "unmap": true, 00:26:31.707 "flush": true, 00:26:31.707 "reset": true, 00:26:31.707 "nvme_admin": false, 00:26:31.707 "nvme_io": false, 00:26:31.707 "nvme_io_md": false, 00:26:31.707 "write_zeroes": true, 00:26:31.707 "zcopy": true, 00:26:31.707 "get_zone_info": false, 00:26:31.707 "zone_management": false, 00:26:31.707 "zone_append": false, 00:26:31.707 "compare": false, 00:26:31.707 "compare_and_write": false, 00:26:31.707 "abort": true, 00:26:31.707 
"seek_hole": false, 00:26:31.707 "seek_data": false, 00:26:31.707 "copy": true, 00:26:31.707 "nvme_iov_md": false 00:26:31.707 }, 00:26:31.707 "memory_domains": [ 00:26:31.707 { 00:26:31.707 "dma_device_id": "system", 00:26:31.707 "dma_device_type": 1 00:26:31.707 }, 00:26:31.707 { 00:26:31.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:31.707 "dma_device_type": 2 00:26:31.707 } 00:26:31.707 ], 00:26:31.707 "driver_specific": {} 00:26:31.707 } 00:26:31.707 ] 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.707 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:31.966 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:31.966 "name": "Existed_Raid", 00:26:31.966 "uuid": "fc8cd25c-fb13-4b72-b38e-48ed96b274d2", 00:26:31.966 "strip_size_kb": 0, 00:26:31.966 "state": "online", 00:26:31.966 "raid_level": "raid1", 00:26:31.966 "superblock": true, 00:26:31.966 "num_base_bdevs": 2, 00:26:31.966 "num_base_bdevs_discovered": 2, 00:26:31.966 "num_base_bdevs_operational": 2, 00:26:31.966 "base_bdevs_list": [ 00:26:31.966 { 00:26:31.966 "name": "BaseBdev1", 00:26:31.966 "uuid": "c0fe2ba1-fa43-447d-b016-07c76f36f198", 00:26:31.966 "is_configured": true, 00:26:31.966 "data_offset": 256, 00:26:31.966 "data_size": 7936 00:26:31.966 }, 00:26:31.966 { 00:26:31.966 "name": "BaseBdev2", 00:26:31.966 "uuid": "736c993e-48b8-4e9f-bae4-ed5e44f8479c", 00:26:31.966 "is_configured": true, 00:26:31.966 "data_offset": 256, 00:26:31.966 "data_size": 7936 00:26:31.966 } 00:26:31.966 ] 00:26:31.966 }' 00:26:31.966 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:31.966 18:30:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:32.533 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:32.533 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:32.533 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:32.533 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:32.533 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:32.533 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:32.533 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:32.533 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:32.791 [2024-07-12 18:30:16.328837] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:32.791 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:32.791 "name": "Existed_Raid", 00:26:32.791 "aliases": [ 00:26:32.791 "fc8cd25c-fb13-4b72-b38e-48ed96b274d2" 00:26:32.791 ], 00:26:32.791 "product_name": "Raid Volume", 00:26:32.791 "block_size": 4096, 00:26:32.791 "num_blocks": 7936, 00:26:32.791 "uuid": "fc8cd25c-fb13-4b72-b38e-48ed96b274d2", 00:26:32.791 "assigned_rate_limits": { 00:26:32.791 "rw_ios_per_sec": 0, 00:26:32.791 "rw_mbytes_per_sec": 0, 00:26:32.791 "r_mbytes_per_sec": 0, 00:26:32.791 "w_mbytes_per_sec": 0 00:26:32.791 }, 00:26:32.791 "claimed": false, 00:26:32.791 "zoned": false, 00:26:32.791 "supported_io_types": { 00:26:32.791 "read": true, 00:26:32.791 "write": true, 00:26:32.791 "unmap": false, 00:26:32.791 "flush": false, 00:26:32.791 "reset": true, 00:26:32.791 "nvme_admin": false, 00:26:32.791 "nvme_io": false, 00:26:32.791 "nvme_io_md": false, 00:26:32.791 "write_zeroes": true, 00:26:32.791 "zcopy": false, 00:26:32.791 "get_zone_info": false, 00:26:32.791 "zone_management": false, 00:26:32.791 "zone_append": false, 00:26:32.791 "compare": false, 00:26:32.791 "compare_and_write": false, 00:26:32.791 "abort": false, 00:26:32.791 "seek_hole": false, 00:26:32.791 "seek_data": false, 
00:26:32.791 "copy": false, 00:26:32.791 "nvme_iov_md": false 00:26:32.791 }, 00:26:32.791 "memory_domains": [ 00:26:32.791 { 00:26:32.791 "dma_device_id": "system", 00:26:32.791 "dma_device_type": 1 00:26:32.791 }, 00:26:32.791 { 00:26:32.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:32.791 "dma_device_type": 2 00:26:32.791 }, 00:26:32.791 { 00:26:32.791 "dma_device_id": "system", 00:26:32.791 "dma_device_type": 1 00:26:32.791 }, 00:26:32.791 { 00:26:32.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:32.791 "dma_device_type": 2 00:26:32.791 } 00:26:32.791 ], 00:26:32.791 "driver_specific": { 00:26:32.791 "raid": { 00:26:32.791 "uuid": "fc8cd25c-fb13-4b72-b38e-48ed96b274d2", 00:26:32.791 "strip_size_kb": 0, 00:26:32.791 "state": "online", 00:26:32.791 "raid_level": "raid1", 00:26:32.792 "superblock": true, 00:26:32.792 "num_base_bdevs": 2, 00:26:32.792 "num_base_bdevs_discovered": 2, 00:26:32.792 "num_base_bdevs_operational": 2, 00:26:32.792 "base_bdevs_list": [ 00:26:32.792 { 00:26:32.792 "name": "BaseBdev1", 00:26:32.792 "uuid": "c0fe2ba1-fa43-447d-b016-07c76f36f198", 00:26:32.792 "is_configured": true, 00:26:32.792 "data_offset": 256, 00:26:32.792 "data_size": 7936 00:26:32.792 }, 00:26:32.792 { 00:26:32.792 "name": "BaseBdev2", 00:26:32.792 "uuid": "736c993e-48b8-4e9f-bae4-ed5e44f8479c", 00:26:32.792 "is_configured": true, 00:26:32.792 "data_offset": 256, 00:26:32.792 "data_size": 7936 00:26:32.792 } 00:26:32.792 ] 00:26:32.792 } 00:26:32.792 } 00:26:32.792 }' 00:26:32.792 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:32.792 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:32.792 BaseBdev2' 00:26:32.792 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:32.792 18:30:16 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:32.792 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:33.050 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:33.050 "name": "BaseBdev1", 00:26:33.050 "aliases": [ 00:26:33.050 "c0fe2ba1-fa43-447d-b016-07c76f36f198" 00:26:33.050 ], 00:26:33.050 "product_name": "Malloc disk", 00:26:33.050 "block_size": 4096, 00:26:33.050 "num_blocks": 8192, 00:26:33.050 "uuid": "c0fe2ba1-fa43-447d-b016-07c76f36f198", 00:26:33.050 "assigned_rate_limits": { 00:26:33.050 "rw_ios_per_sec": 0, 00:26:33.050 "rw_mbytes_per_sec": 0, 00:26:33.050 "r_mbytes_per_sec": 0, 00:26:33.050 "w_mbytes_per_sec": 0 00:26:33.050 }, 00:26:33.050 "claimed": true, 00:26:33.050 "claim_type": "exclusive_write", 00:26:33.050 "zoned": false, 00:26:33.050 "supported_io_types": { 00:26:33.050 "read": true, 00:26:33.050 "write": true, 00:26:33.050 "unmap": true, 00:26:33.050 "flush": true, 00:26:33.050 "reset": true, 00:26:33.050 "nvme_admin": false, 00:26:33.050 "nvme_io": false, 00:26:33.050 "nvme_io_md": false, 00:26:33.050 "write_zeroes": true, 00:26:33.050 "zcopy": true, 00:26:33.050 "get_zone_info": false, 00:26:33.050 "zone_management": false, 00:26:33.050 "zone_append": false, 00:26:33.050 "compare": false, 00:26:33.050 "compare_and_write": false, 00:26:33.050 "abort": true, 00:26:33.050 "seek_hole": false, 00:26:33.050 "seek_data": false, 00:26:33.050 "copy": true, 00:26:33.050 "nvme_iov_md": false 00:26:33.050 }, 00:26:33.050 "memory_domains": [ 00:26:33.050 { 00:26:33.050 "dma_device_id": "system", 00:26:33.051 "dma_device_type": 1 00:26:33.051 }, 00:26:33.051 { 00:26:33.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:33.051 "dma_device_type": 2 00:26:33.051 } 00:26:33.051 ], 00:26:33.051 "driver_specific": 
{} 00:26:33.051 }' 00:26:33.051 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:33.051 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:33.051 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:33.051 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:33.051 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:33.308 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:33.308 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:33.308 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:33.308 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:33.308 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:33.308 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:33.308 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:33.308 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:33.309 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:33.309 18:30:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:33.600 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:33.600 "name": "BaseBdev2", 00:26:33.600 "aliases": [ 00:26:33.600 "736c993e-48b8-4e9f-bae4-ed5e44f8479c" 00:26:33.600 
], 00:26:33.600 "product_name": "Malloc disk", 00:26:33.600 "block_size": 4096, 00:26:33.600 "num_blocks": 8192, 00:26:33.600 "uuid": "736c993e-48b8-4e9f-bae4-ed5e44f8479c", 00:26:33.600 "assigned_rate_limits": { 00:26:33.600 "rw_ios_per_sec": 0, 00:26:33.600 "rw_mbytes_per_sec": 0, 00:26:33.600 "r_mbytes_per_sec": 0, 00:26:33.600 "w_mbytes_per_sec": 0 00:26:33.600 }, 00:26:33.600 "claimed": true, 00:26:33.600 "claim_type": "exclusive_write", 00:26:33.600 "zoned": false, 00:26:33.600 "supported_io_types": { 00:26:33.600 "read": true, 00:26:33.600 "write": true, 00:26:33.600 "unmap": true, 00:26:33.600 "flush": true, 00:26:33.600 "reset": true, 00:26:33.600 "nvme_admin": false, 00:26:33.600 "nvme_io": false, 00:26:33.600 "nvme_io_md": false, 00:26:33.600 "write_zeroes": true, 00:26:33.600 "zcopy": true, 00:26:33.600 "get_zone_info": false, 00:26:33.600 "zone_management": false, 00:26:33.600 "zone_append": false, 00:26:33.600 "compare": false, 00:26:33.600 "compare_and_write": false, 00:26:33.600 "abort": true, 00:26:33.600 "seek_hole": false, 00:26:33.600 "seek_data": false, 00:26:33.600 "copy": true, 00:26:33.600 "nvme_iov_md": false 00:26:33.600 }, 00:26:33.600 "memory_domains": [ 00:26:33.600 { 00:26:33.600 "dma_device_id": "system", 00:26:33.600 "dma_device_type": 1 00:26:33.600 }, 00:26:33.600 { 00:26:33.600 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:33.600 "dma_device_type": 2 00:26:33.600 } 00:26:33.600 ], 00:26:33.600 "driver_specific": {} 00:26:33.600 }' 00:26:33.600 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:33.600 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:33.878 18:30:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:34.444 [2024-07-12 18:30:18.081277] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:34.444 18:30:18 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.444 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:34.702 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:34.702 "name": "Existed_Raid", 00:26:34.702 "uuid": "fc8cd25c-fb13-4b72-b38e-48ed96b274d2", 00:26:34.702 "strip_size_kb": 0, 00:26:34.702 "state": "online", 00:26:34.702 "raid_level": "raid1", 00:26:34.702 "superblock": true, 00:26:34.702 "num_base_bdevs": 2, 00:26:34.702 "num_base_bdevs_discovered": 1, 00:26:34.702 "num_base_bdevs_operational": 1, 00:26:34.702 "base_bdevs_list": [ 00:26:34.702 { 00:26:34.702 "name": null, 00:26:34.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.702 "is_configured": false, 00:26:34.702 "data_offset": 256, 00:26:34.702 "data_size": 7936 00:26:34.702 }, 00:26:34.702 { 
00:26:34.702 "name": "BaseBdev2", 00:26:34.702 "uuid": "736c993e-48b8-4e9f-bae4-ed5e44f8479c", 00:26:34.702 "is_configured": true, 00:26:34.702 "data_offset": 256, 00:26:34.702 "data_size": 7936 00:26:34.702 } 00:26:34.702 ] 00:26:34.702 }' 00:26:34.702 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:34.702 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:35.268 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:35.268 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:35.268 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.268 18:30:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:35.526 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:35.526 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:35.526 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:35.784 [2024-07-12 18:30:19.429882] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:35.784 [2024-07-12 18:30:19.429971] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:35.784 [2024-07-12 18:30:19.441111] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:35.784 [2024-07-12 18:30:19.441146] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:35.784 [2024-07-12 
18:30:19.441157] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f00000 name Existed_Raid, state offline 00:26:35.784 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:35.784 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:35.784 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.784 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 2597705 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2597705 ']' 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2597705 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2597705 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:36.042 18:30:19 
bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2597705' 00:26:36.042 killing process with pid 2597705 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2597705 00:26:36.042 [2024-07-12 18:30:19.758312] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:36.042 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2597705 00:26:36.042 [2024-07-12 18:30:19.759162] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:36.300 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:26:36.300 00:26:36.300 real 0m10.810s 00:26:36.300 user 0m19.216s 00:26:36.300 sys 0m2.044s 00:26:36.300 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:36.300 18:30:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:36.300 ************************************ 00:26:36.300 END TEST raid_state_function_test_sb_4k 00:26:36.300 ************************************ 00:26:36.300 18:30:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:36.300 18:30:20 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:26:36.300 18:30:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:26:36.300 18:30:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:36.300 18:30:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:36.558 ************************************ 00:26:36.558 START TEST raid_superblock_test_4k 00:26:36.558 ************************************ 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local 
raid_level=raid1 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:26:36.558 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=2599321 00:26:36.559 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 2599321 /var/tmp/spdk-raid.sock 00:26:36.559 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:36.559 18:30:20 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 2599321 ']' 00:26:36.559 18:30:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:36.559 18:30:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:36.559 18:30:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:36.559 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:36.559 18:30:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:36.559 18:30:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:36.559 [2024-07-12 18:30:20.116457] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:26:36.559 [2024-07-12 18:30:20.116526] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2599321 ] 00:26:36.559 [2024-07-12 18:30:20.247843] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:36.816 [2024-07-12 18:30:20.359909] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:36.816 [2024-07-12 18:30:20.422262] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:36.816 [2024-07-12 18:30:20.422296] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- 
# (( i = 1 )) 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:37.074 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:26:37.333 malloc1 00:26:37.333 18:30:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:37.333 [2024-07-12 18:30:21.054067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:37.333 [2024-07-12 18:30:21.054118] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:37.333 [2024-07-12 18:30:21.054136] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1561570 00:26:37.333 [2024-07-12 18:30:21.054149] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:37.333 [2024-07-12 18:30:21.055705] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:37.333 [2024-07-12 18:30:21.055736] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:37.333 pt1 00:26:37.591 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:37.591 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:37.591 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:26:37.591 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:26:37.591 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:37.591 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:37.591 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:37.591 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:37.591 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:26:37.591 malloc2 00:26:37.850 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:37.850 [2024-07-12 18:30:21.548156] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:37.850 [2024-07-12 18:30:21.548207] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:37.850 [2024-07-12 18:30:21.548225] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1562970 00:26:37.850 [2024-07-12 18:30:21.548238] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:37.850 [2024-07-12 
18:30:21.549793] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:37.850 [2024-07-12 18:30:21.549824] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:37.850 pt2 00:26:37.850 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:37.850 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:37.850 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:26:38.108 [2024-07-12 18:30:21.792827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:38.108 [2024-07-12 18:30:21.794064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:38.108 [2024-07-12 18:30:21.794206] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1705270 00:26:38.108 [2024-07-12 18:30:21.794218] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:38.108 [2024-07-12 18:30:21.794418] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15590e0 00:26:38.108 [2024-07-12 18:30:21.794560] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1705270 00:26:38.108 [2024-07-12 18:30:21.794570] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1705270 00:26:38.108 [2024-07-12 18:30:21.794664] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.108 18:30:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.366 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:38.366 "name": "raid_bdev1", 00:26:38.366 "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:38.366 "strip_size_kb": 0, 00:26:38.366 "state": "online", 00:26:38.366 "raid_level": "raid1", 00:26:38.366 "superblock": true, 00:26:38.366 "num_base_bdevs": 2, 00:26:38.366 "num_base_bdevs_discovered": 2, 00:26:38.366 "num_base_bdevs_operational": 2, 00:26:38.366 "base_bdevs_list": [ 00:26:38.366 { 00:26:38.366 "name": "pt1", 00:26:38.366 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:38.366 "is_configured": true, 00:26:38.366 "data_offset": 256, 00:26:38.366 "data_size": 7936 00:26:38.366 }, 00:26:38.366 { 00:26:38.366 "name": "pt2", 00:26:38.366 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:38.366 "is_configured": 
true, 00:26:38.366 "data_offset": 256, 00:26:38.366 "data_size": 7936 00:26:38.366 } 00:26:38.366 ] 00:26:38.366 }' 00:26:38.366 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:38.366 18:30:22 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:39.311 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:26:39.311 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:39.312 [2024-07-12 18:30:22.899980] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:39.312 "name": "raid_bdev1", 00:26:39.312 "aliases": [ 00:26:39.312 "e4de1b50-c863-4327-a1b3-705bb3386632" 00:26:39.312 ], 00:26:39.312 "product_name": "Raid Volume", 00:26:39.312 "block_size": 4096, 00:26:39.312 "num_blocks": 7936, 00:26:39.312 "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:39.312 "assigned_rate_limits": { 00:26:39.312 "rw_ios_per_sec": 0, 00:26:39.312 "rw_mbytes_per_sec": 0, 00:26:39.312 "r_mbytes_per_sec": 0, 00:26:39.312 "w_mbytes_per_sec": 0 00:26:39.312 
}, 00:26:39.312 "claimed": false, 00:26:39.312 "zoned": false, 00:26:39.312 "supported_io_types": { 00:26:39.312 "read": true, 00:26:39.312 "write": true, 00:26:39.312 "unmap": false, 00:26:39.312 "flush": false, 00:26:39.312 "reset": true, 00:26:39.312 "nvme_admin": false, 00:26:39.312 "nvme_io": false, 00:26:39.312 "nvme_io_md": false, 00:26:39.312 "write_zeroes": true, 00:26:39.312 "zcopy": false, 00:26:39.312 "get_zone_info": false, 00:26:39.312 "zone_management": false, 00:26:39.312 "zone_append": false, 00:26:39.312 "compare": false, 00:26:39.312 "compare_and_write": false, 00:26:39.312 "abort": false, 00:26:39.312 "seek_hole": false, 00:26:39.312 "seek_data": false, 00:26:39.312 "copy": false, 00:26:39.312 "nvme_iov_md": false 00:26:39.312 }, 00:26:39.312 "memory_domains": [ 00:26:39.312 { 00:26:39.312 "dma_device_id": "system", 00:26:39.312 "dma_device_type": 1 00:26:39.312 }, 00:26:39.312 { 00:26:39.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:39.312 "dma_device_type": 2 00:26:39.312 }, 00:26:39.312 { 00:26:39.312 "dma_device_id": "system", 00:26:39.312 "dma_device_type": 1 00:26:39.312 }, 00:26:39.312 { 00:26:39.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:39.312 "dma_device_type": 2 00:26:39.312 } 00:26:39.312 ], 00:26:39.312 "driver_specific": { 00:26:39.312 "raid": { 00:26:39.312 "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:39.312 "strip_size_kb": 0, 00:26:39.312 "state": "online", 00:26:39.312 "raid_level": "raid1", 00:26:39.312 "superblock": true, 00:26:39.312 "num_base_bdevs": 2, 00:26:39.312 "num_base_bdevs_discovered": 2, 00:26:39.312 "num_base_bdevs_operational": 2, 00:26:39.312 "base_bdevs_list": [ 00:26:39.312 { 00:26:39.312 "name": "pt1", 00:26:39.312 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:39.312 "is_configured": true, 00:26:39.312 "data_offset": 256, 00:26:39.312 "data_size": 7936 00:26:39.312 }, 00:26:39.312 { 00:26:39.312 "name": "pt2", 00:26:39.312 "uuid": "00000000-0000-0000-0000-000000000002", 
00:26:39.312 "is_configured": true, 00:26:39.312 "data_offset": 256, 00:26:39.312 "data_size": 7936 00:26:39.312 } 00:26:39.312 ] 00:26:39.312 } 00:26:39.312 } 00:26:39.312 }' 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:39.312 pt2' 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:39.312 18:30:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:39.574 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:39.574 "name": "pt1", 00:26:39.574 "aliases": [ 00:26:39.574 "00000000-0000-0000-0000-000000000001" 00:26:39.574 ], 00:26:39.574 "product_name": "passthru", 00:26:39.574 "block_size": 4096, 00:26:39.574 "num_blocks": 8192, 00:26:39.574 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:39.574 "assigned_rate_limits": { 00:26:39.574 "rw_ios_per_sec": 0, 00:26:39.574 "rw_mbytes_per_sec": 0, 00:26:39.574 "r_mbytes_per_sec": 0, 00:26:39.574 "w_mbytes_per_sec": 0 00:26:39.574 }, 00:26:39.574 "claimed": true, 00:26:39.574 "claim_type": "exclusive_write", 00:26:39.574 "zoned": false, 00:26:39.574 "supported_io_types": { 00:26:39.574 "read": true, 00:26:39.574 "write": true, 00:26:39.574 "unmap": true, 00:26:39.574 "flush": true, 00:26:39.574 "reset": true, 00:26:39.574 "nvme_admin": false, 00:26:39.574 "nvme_io": false, 00:26:39.574 "nvme_io_md": false, 00:26:39.574 "write_zeroes": true, 00:26:39.574 "zcopy": true, 00:26:39.574 "get_zone_info": false, 00:26:39.574 "zone_management": false, 
00:26:39.574 "zone_append": false, 00:26:39.574 "compare": false, 00:26:39.574 "compare_and_write": false, 00:26:39.574 "abort": true, 00:26:39.574 "seek_hole": false, 00:26:39.574 "seek_data": false, 00:26:39.574 "copy": true, 00:26:39.574 "nvme_iov_md": false 00:26:39.574 }, 00:26:39.574 "memory_domains": [ 00:26:39.574 { 00:26:39.574 "dma_device_id": "system", 00:26:39.574 "dma_device_type": 1 00:26:39.574 }, 00:26:39.574 { 00:26:39.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:39.574 "dma_device_type": 2 00:26:39.574 } 00:26:39.574 ], 00:26:39.574 "driver_specific": { 00:26:39.574 "passthru": { 00:26:39.574 "name": "pt1", 00:26:39.574 "base_bdev_name": "malloc1" 00:26:39.574 } 00:26:39.574 } 00:26:39.574 }' 00:26:39.574 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:39.574 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:39.832 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:39.832 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:39.832 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:39.832 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:39.832 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:39.832 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:39.832 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:39.832 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:39.832 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:40.090 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:40.090 18:30:23 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:40.090 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:40.090 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:40.090 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:40.090 "name": "pt2", 00:26:40.090 "aliases": [ 00:26:40.090 "00000000-0000-0000-0000-000000000002" 00:26:40.090 ], 00:26:40.090 "product_name": "passthru", 00:26:40.090 "block_size": 4096, 00:26:40.090 "num_blocks": 8192, 00:26:40.090 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:40.090 "assigned_rate_limits": { 00:26:40.090 "rw_ios_per_sec": 0, 00:26:40.090 "rw_mbytes_per_sec": 0, 00:26:40.090 "r_mbytes_per_sec": 0, 00:26:40.090 "w_mbytes_per_sec": 0 00:26:40.090 }, 00:26:40.090 "claimed": true, 00:26:40.090 "claim_type": "exclusive_write", 00:26:40.090 "zoned": false, 00:26:40.090 "supported_io_types": { 00:26:40.090 "read": true, 00:26:40.090 "write": true, 00:26:40.090 "unmap": true, 00:26:40.090 "flush": true, 00:26:40.090 "reset": true, 00:26:40.090 "nvme_admin": false, 00:26:40.090 "nvme_io": false, 00:26:40.090 "nvme_io_md": false, 00:26:40.090 "write_zeroes": true, 00:26:40.090 "zcopy": true, 00:26:40.090 "get_zone_info": false, 00:26:40.090 "zone_management": false, 00:26:40.090 "zone_append": false, 00:26:40.090 "compare": false, 00:26:40.090 "compare_and_write": false, 00:26:40.090 "abort": true, 00:26:40.090 "seek_hole": false, 00:26:40.090 "seek_data": false, 00:26:40.090 "copy": true, 00:26:40.090 "nvme_iov_md": false 00:26:40.090 }, 00:26:40.090 "memory_domains": [ 00:26:40.090 { 00:26:40.090 "dma_device_id": "system", 00:26:40.090 "dma_device_type": 1 00:26:40.090 }, 00:26:40.090 { 00:26:40.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:40.090 
"dma_device_type": 2 00:26:40.090 } 00:26:40.090 ], 00:26:40.090 "driver_specific": { 00:26:40.090 "passthru": { 00:26:40.090 "name": "pt2", 00:26:40.090 "base_bdev_name": "malloc2" 00:26:40.090 } 00:26:40.090 } 00:26:40.090 }' 00:26:40.090 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:40.348 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:40.348 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:40.348 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:40.348 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:40.348 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:40.348 18:30:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:40.348 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:40.348 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:40.348 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:40.606 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:40.606 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:40.606 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:40.606 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:40.864 [2024-07-12 18:30:24.367883] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:40.864 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # 
raid_bdev_uuid=e4de1b50-c863-4327-a1b3-705bb3386632 00:26:40.864 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z e4de1b50-c863-4327-a1b3-705bb3386632 ']' 00:26:40.864 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:41.122 [2024-07-12 18:30:24.612284] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:41.122 [2024-07-12 18:30:24.612305] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:41.122 [2024-07-12 18:30:24.612356] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:41.122 [2024-07-12 18:30:24.612412] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:41.122 [2024-07-12 18:30:24.612423] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1705270 name raid_bdev1, state offline 00:26:41.123 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.123 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:41.381 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:41.381 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:41.381 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:41.381 18:30:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:41.381 18:30:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 
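The teardown sequence just above queries `bdev_raid_get_bdevs all`, extracts the uuid with `jq -r '.[] | .uuid'`, deletes the raid bdev, and re-queries to confirm nothing is left (`raid_bdev=` stays empty, so `'[' -n '' ']'` does not fire). As an illustrative translation only — the helper name below is invented, and the sample JSON is shaped like the output logged earlier, not a live RPC call:

```python
import json

def raid_bdev_uuid(get_bdevs_json: str) -> str:
    """Mirror of jq -r '.[] | .uuid' over bdev_raid_get_bdevs output:
    return the uuid of the first raid bdev, or '' when none is configured."""
    bdevs = json.loads(get_bdevs_json)
    return bdevs[0]["uuid"] if bdevs else ""

# Before bdev_raid_delete, the query returns the configured raid bdev ...
before = '[{"name": "raid_bdev1", "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632"}]'
assert raid_bdev_uuid(before) == "e4de1b50-c863-4327-a1b3-705bb3386632"
# ... and afterwards it returns an empty list, so the test sees an empty string.
assert raid_bdev_uuid("[]") == ""
```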
00:26:41.639 18:30:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:41.639 18:30:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:41.639 18:30:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:41.897 18:30:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:41.897 18:30:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
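After deleting pt1 and pt2, the script checks `jq -r '[.[] | select(.product_name == "passthru")] | any'` against `bdev_get_bdevs` and expects `false`. A minimal Python sketch of that same predicate (hypothetical function name; sample data mimics the shape of `bdev_get_bdevs` output with only the malloc base bdevs remaining):

```python
import json

def any_passthru(get_bdevs_json: str) -> bool:
    """Mirror of jq '[.[] | select(.product_name == "passthru")] | any'."""
    return any(b.get("product_name") == "passthru"
               for b in json.loads(get_bdevs_json))

# Once both passthru bdevs are deleted, only the malloc disks remain,
# so the check yields False and the '[' false == true ']' branch is skipped.
remaining = ('[{"name": "malloc1", "product_name": "Malloc disk"},'
             ' {"name": "malloc2", "product_name": "Malloc disk"}]')
assert any_passthru(remaining) is False
```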
00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:41.898 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:42.156 [2024-07-12 18:30:25.831465] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:42.156 [2024-07-12 18:30:25.832803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:42.156 [2024-07-12 18:30:25.832858] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:42.156 [2024-07-12 18:30:25.832897] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:42.156 [2024-07-12 18:30:25.832916] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:42.156 [2024-07-12 18:30:25.832934] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1704ff0 name raid_bdev1, state configuring 00:26:42.156 request: 00:26:42.156 { 00:26:42.156 "name": "raid_bdev1", 00:26:42.156 "raid_level": "raid1", 00:26:42.156 "base_bdevs": [ 00:26:42.156 "malloc1", 00:26:42.156 "malloc2" 00:26:42.156 ], 00:26:42.156 "superblock": false, 00:26:42.156 "method": "bdev_raid_create", 00:26:42.156 "req_id": 1 00:26:42.156 } 00:26:42.156 Got JSON-RPC error response 00:26:42.156 response: 00:26:42.156 { 00:26:42.156 "code": -17, 00:26:42.156 "message": "Failed 
to create RAID bdev raid_bdev1: File exists" 00:26:42.156 } 00:26:42.156 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:26:42.156 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:42.156 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:42.156 18:30:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:42.156 18:30:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.156 18:30:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:42.415 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:42.415 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:42.415 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:42.674 [2024-07-12 18:30:26.324719] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:42.674 [2024-07-12 18:30:26.324765] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:42.674 [2024-07-12 18:30:26.324786] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15617a0 00:26:42.674 [2024-07-12 18:30:26.324800] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:42.674 [2024-07-12 18:30:26.326398] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:42.674 [2024-07-12 18:30:26.326429] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:42.674 [2024-07-12 18:30:26.326496] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:42.674 [2024-07-12 18:30:26.326520] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:42.674 pt1 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.674 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.932 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.932 "name": "raid_bdev1", 00:26:42.932 "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:42.932 "strip_size_kb": 0, 00:26:42.932 "state": "configuring", 00:26:42.932 "raid_level": "raid1", 
00:26:42.932 "superblock": true, 00:26:42.932 "num_base_bdevs": 2, 00:26:42.932 "num_base_bdevs_discovered": 1, 00:26:42.932 "num_base_bdevs_operational": 2, 00:26:42.932 "base_bdevs_list": [ 00:26:42.932 { 00:26:42.932 "name": "pt1", 00:26:42.932 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:42.932 "is_configured": true, 00:26:42.932 "data_offset": 256, 00:26:42.932 "data_size": 7936 00:26:42.932 }, 00:26:42.932 { 00:26:42.932 "name": null, 00:26:42.932 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:42.932 "is_configured": false, 00:26:42.932 "data_offset": 256, 00:26:42.932 "data_size": 7936 00:26:42.932 } 00:26:42.932 ] 00:26:42.932 }' 00:26:42.932 18:30:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.932 18:30:26 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:43.498 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:26:43.498 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:43.498 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:43.498 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:43.757 [2024-07-12 18:30:27.411610] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:43.757 [2024-07-12 18:30:27.411669] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:43.757 [2024-07-12 18:30:27.411687] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16f96f0 00:26:43.757 [2024-07-12 18:30:27.411700] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:43.757 [2024-07-12 18:30:27.412056] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:26:43.757 [2024-07-12 18:30:27.412076] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:43.757 [2024-07-12 18:30:27.412138] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:43.757 [2024-07-12 18:30:27.412158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:43.757 [2024-07-12 18:30:27.412255] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16fa590 00:26:43.757 [2024-07-12 18:30:27.412266] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:43.757 [2024-07-12 18:30:27.412439] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x155b540 00:26:43.757 [2024-07-12 18:30:27.412564] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16fa590 00:26:43.757 [2024-07-12 18:30:27.412574] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16fa590 00:26:43.757 [2024-07-12 18:30:27.412669] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:43.757 pt2 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:43.757 18:30:27 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.757 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.016 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:44.016 "name": "raid_bdev1", 00:26:44.016 "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:44.016 "strip_size_kb": 0, 00:26:44.016 "state": "online", 00:26:44.016 "raid_level": "raid1", 00:26:44.016 "superblock": true, 00:26:44.016 "num_base_bdevs": 2, 00:26:44.016 "num_base_bdevs_discovered": 2, 00:26:44.016 "num_base_bdevs_operational": 2, 00:26:44.016 "base_bdevs_list": [ 00:26:44.016 { 00:26:44.016 "name": "pt1", 00:26:44.016 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:44.016 "is_configured": true, 00:26:44.016 "data_offset": 256, 00:26:44.016 "data_size": 7936 00:26:44.016 }, 00:26:44.016 { 00:26:44.016 "name": "pt2", 00:26:44.016 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:44.016 "is_configured": true, 00:26:44.016 "data_offset": 256, 00:26:44.016 "data_size": 7936 00:26:44.016 } 00:26:44.017 ] 00:26:44.017 }' 00:26:44.017 18:30:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:44.017 18:30:27 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@10 -- # set +x 00:26:44.583 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:26:44.583 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:44.583 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:44.583 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:44.583 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:44.583 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:44.583 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:44.583 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:44.841 [2024-07-12 18:30:28.522803] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:44.841 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:44.841 "name": "raid_bdev1", 00:26:44.841 "aliases": [ 00:26:44.841 "e4de1b50-c863-4327-a1b3-705bb3386632" 00:26:44.841 ], 00:26:44.841 "product_name": "Raid Volume", 00:26:44.841 "block_size": 4096, 00:26:44.841 "num_blocks": 7936, 00:26:44.841 "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:44.841 "assigned_rate_limits": { 00:26:44.841 "rw_ios_per_sec": 0, 00:26:44.841 "rw_mbytes_per_sec": 0, 00:26:44.841 "r_mbytes_per_sec": 0, 00:26:44.841 "w_mbytes_per_sec": 0 00:26:44.841 }, 00:26:44.841 "claimed": false, 00:26:44.841 "zoned": false, 00:26:44.841 "supported_io_types": { 00:26:44.841 "read": true, 00:26:44.842 "write": true, 00:26:44.842 "unmap": false, 00:26:44.842 "flush": false, 00:26:44.842 "reset": true, 00:26:44.842 "nvme_admin": false, 
00:26:44.842 "nvme_io": false, 00:26:44.842 "nvme_io_md": false, 00:26:44.842 "write_zeroes": true, 00:26:44.842 "zcopy": false, 00:26:44.842 "get_zone_info": false, 00:26:44.842 "zone_management": false, 00:26:44.842 "zone_append": false, 00:26:44.842 "compare": false, 00:26:44.842 "compare_and_write": false, 00:26:44.842 "abort": false, 00:26:44.842 "seek_hole": false, 00:26:44.842 "seek_data": false, 00:26:44.842 "copy": false, 00:26:44.842 "nvme_iov_md": false 00:26:44.842 }, 00:26:44.842 "memory_domains": [ 00:26:44.842 { 00:26:44.842 "dma_device_id": "system", 00:26:44.842 "dma_device_type": 1 00:26:44.842 }, 00:26:44.842 { 00:26:44.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:44.842 "dma_device_type": 2 00:26:44.842 }, 00:26:44.842 { 00:26:44.842 "dma_device_id": "system", 00:26:44.842 "dma_device_type": 1 00:26:44.842 }, 00:26:44.842 { 00:26:44.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:44.842 "dma_device_type": 2 00:26:44.842 } 00:26:44.842 ], 00:26:44.842 "driver_specific": { 00:26:44.842 "raid": { 00:26:44.842 "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:44.842 "strip_size_kb": 0, 00:26:44.842 "state": "online", 00:26:44.842 "raid_level": "raid1", 00:26:44.842 "superblock": true, 00:26:44.842 "num_base_bdevs": 2, 00:26:44.842 "num_base_bdevs_discovered": 2, 00:26:44.842 "num_base_bdevs_operational": 2, 00:26:44.842 "base_bdevs_list": [ 00:26:44.842 { 00:26:44.842 "name": "pt1", 00:26:44.842 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:44.842 "is_configured": true, 00:26:44.842 "data_offset": 256, 00:26:44.842 "data_size": 7936 00:26:44.842 }, 00:26:44.842 { 00:26:44.842 "name": "pt2", 00:26:44.842 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:44.842 "is_configured": true, 00:26:44.842 "data_offset": 256, 00:26:44.842 "data_size": 7936 00:26:44.842 } 00:26:44.842 ] 00:26:44.842 } 00:26:44.842 } 00:26:44.842 }' 00:26:44.842 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:45.100 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:45.100 pt2' 00:26:45.100 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:45.100 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:45.100 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:45.100 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:45.100 "name": "pt1", 00:26:45.100 "aliases": [ 00:26:45.100 "00000000-0000-0000-0000-000000000001" 00:26:45.100 ], 00:26:45.100 "product_name": "passthru", 00:26:45.100 "block_size": 4096, 00:26:45.100 "num_blocks": 8192, 00:26:45.100 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:45.100 "assigned_rate_limits": { 00:26:45.100 "rw_ios_per_sec": 0, 00:26:45.100 "rw_mbytes_per_sec": 0, 00:26:45.100 "r_mbytes_per_sec": 0, 00:26:45.100 "w_mbytes_per_sec": 0 00:26:45.100 }, 00:26:45.100 "claimed": true, 00:26:45.100 "claim_type": "exclusive_write", 00:26:45.100 "zoned": false, 00:26:45.100 "supported_io_types": { 00:26:45.101 "read": true, 00:26:45.101 "write": true, 00:26:45.101 "unmap": true, 00:26:45.101 "flush": true, 00:26:45.101 "reset": true, 00:26:45.101 "nvme_admin": false, 00:26:45.101 "nvme_io": false, 00:26:45.101 "nvme_io_md": false, 00:26:45.101 "write_zeroes": true, 00:26:45.101 "zcopy": true, 00:26:45.101 "get_zone_info": false, 00:26:45.101 "zone_management": false, 00:26:45.101 "zone_append": false, 00:26:45.101 "compare": false, 00:26:45.101 "compare_and_write": false, 00:26:45.101 "abort": true, 00:26:45.101 "seek_hole": false, 00:26:45.101 "seek_data": false, 00:26:45.101 "copy": true, 00:26:45.101 "nvme_iov_md": false 00:26:45.101 
}, 00:26:45.101 "memory_domains": [ 00:26:45.101 { 00:26:45.101 "dma_device_id": "system", 00:26:45.101 "dma_device_type": 1 00:26:45.101 }, 00:26:45.101 { 00:26:45.101 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:45.101 "dma_device_type": 2 00:26:45.101 } 00:26:45.101 ], 00:26:45.101 "driver_specific": { 00:26:45.101 "passthru": { 00:26:45.101 "name": "pt1", 00:26:45.101 "base_bdev_name": "malloc1" 00:26:45.101 } 00:26:45.101 } 00:26:45.101 }' 00:26:45.358 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:45.358 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:45.358 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:45.358 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:45.358 18:30:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:45.358 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:45.358 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:45.358 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:45.617 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:45.617 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:45.617 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:45.617 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:45.617 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:45.617 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 
00:26:45.617 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:45.875 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:45.875 "name": "pt2", 00:26:45.875 "aliases": [ 00:26:45.875 "00000000-0000-0000-0000-000000000002" 00:26:45.875 ], 00:26:45.875 "product_name": "passthru", 00:26:45.875 "block_size": 4096, 00:26:45.875 "num_blocks": 8192, 00:26:45.875 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:45.875 "assigned_rate_limits": { 00:26:45.875 "rw_ios_per_sec": 0, 00:26:45.875 "rw_mbytes_per_sec": 0, 00:26:45.875 "r_mbytes_per_sec": 0, 00:26:45.875 "w_mbytes_per_sec": 0 00:26:45.875 }, 00:26:45.875 "claimed": true, 00:26:45.875 "claim_type": "exclusive_write", 00:26:45.875 "zoned": false, 00:26:45.875 "supported_io_types": { 00:26:45.875 "read": true, 00:26:45.875 "write": true, 00:26:45.875 "unmap": true, 00:26:45.875 "flush": true, 00:26:45.875 "reset": true, 00:26:45.875 "nvme_admin": false, 00:26:45.875 "nvme_io": false, 00:26:45.875 "nvme_io_md": false, 00:26:45.875 "write_zeroes": true, 00:26:45.875 "zcopy": true, 00:26:45.875 "get_zone_info": false, 00:26:45.875 "zone_management": false, 00:26:45.875 "zone_append": false, 00:26:45.875 "compare": false, 00:26:45.875 "compare_and_write": false, 00:26:45.875 "abort": true, 00:26:45.875 "seek_hole": false, 00:26:45.875 "seek_data": false, 00:26:45.875 "copy": true, 00:26:45.875 "nvme_iov_md": false 00:26:45.875 }, 00:26:45.875 "memory_domains": [ 00:26:45.875 { 00:26:45.875 "dma_device_id": "system", 00:26:45.875 "dma_device_type": 1 00:26:45.875 }, 00:26:45.875 { 00:26:45.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:45.875 "dma_device_type": 2 00:26:45.875 } 00:26:45.875 ], 00:26:45.875 "driver_specific": { 00:26:45.875 "passthru": { 00:26:45.875 "name": "pt2", 00:26:45.875 "base_bdev_name": "malloc2" 00:26:45.875 } 00:26:45.875 } 00:26:45.875 }' 00:26:45.875 18:30:29 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:45.875 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:45.875 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:45.875 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:45.875 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:46.133 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:46.133 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:46.133 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:46.133 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:46.133 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:46.133 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:46.133 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:46.133 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:46.133 18:30:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:26:46.390 [2024-07-12 18:30:30.034830] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:46.390 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' e4de1b50-c863-4327-a1b3-705bb3386632 '!=' e4de1b50-c863-4327-a1b3-705bb3386632 ']' 00:26:46.390 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:26:46.390 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:46.391 18:30:30 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:46.391 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:46.648 [2024-07-12 18:30:30.283265] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.648 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.906 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:46.906 "name": "raid_bdev1", 00:26:46.906 "uuid": 
"e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:46.906 "strip_size_kb": 0, 00:26:46.906 "state": "online", 00:26:46.906 "raid_level": "raid1", 00:26:46.906 "superblock": true, 00:26:46.906 "num_base_bdevs": 2, 00:26:46.906 "num_base_bdevs_discovered": 1, 00:26:46.906 "num_base_bdevs_operational": 1, 00:26:46.906 "base_bdevs_list": [ 00:26:46.906 { 00:26:46.906 "name": null, 00:26:46.906 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:46.906 "is_configured": false, 00:26:46.906 "data_offset": 256, 00:26:46.906 "data_size": 7936 00:26:46.906 }, 00:26:46.906 { 00:26:46.906 "name": "pt2", 00:26:46.906 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:46.906 "is_configured": true, 00:26:46.906 "data_offset": 256, 00:26:46.906 "data_size": 7936 00:26:46.906 } 00:26:46.906 ] 00:26:46.906 }' 00:26:46.906 18:30:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:46.906 18:30:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:47.472 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:47.731 [2024-07-12 18:30:31.370120] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:47.731 [2024-07-12 18:30:31.370147] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:47.731 [2024-07-12 18:30:31.370194] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:47.731 [2024-07-12 18:30:31.370233] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:47.731 [2024-07-12 18:30:31.370244] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16fa590 name raid_bdev1, state offline 00:26:47.731 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:26:47.731 18:30:31 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.989 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:26:47.989 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:26:47.989 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:26:47.989 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:47.989 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:48.248 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:26:48.248 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:48.248 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:26:48.248 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:26:48.248 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:26:48.248 18:30:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:48.507 [2024-07-12 18:30:32.112043] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:48.507 [2024-07-12 18:30:32.112085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:48.507 [2024-07-12 18:30:32.112102] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1562160 00:26:48.507 [2024-07-12 18:30:32.112114] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:48.507 [2024-07-12 18:30:32.113710] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:48.507 [2024-07-12 18:30:32.113741] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:48.507 [2024-07-12 18:30:32.113805] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:48.507 [2024-07-12 18:30:32.113829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:48.507 [2024-07-12 18:30:32.113910] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1558380 00:26:48.507 [2024-07-12 18:30:32.113920] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:48.507 [2024-07-12 18:30:32.114095] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1559a80 00:26:48.507 [2024-07-12 18:30:32.114216] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1558380 00:26:48.507 [2024-07-12 18:30:32.114225] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1558380 00:26:48.507 [2024-07-12 18:30:32.114320] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:48.507 pt2 00:26:48.507 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:48.507 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:48.507 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:48.508 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:48.508 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:48.508 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:26:48.508 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:48.508 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:48.508 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:48.508 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:48.508 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.508 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.766 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:48.766 "name": "raid_bdev1", 00:26:48.766 "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:48.766 "strip_size_kb": 0, 00:26:48.766 "state": "online", 00:26:48.766 "raid_level": "raid1", 00:26:48.766 "superblock": true, 00:26:48.766 "num_base_bdevs": 2, 00:26:48.766 "num_base_bdevs_discovered": 1, 00:26:48.766 "num_base_bdevs_operational": 1, 00:26:48.766 "base_bdevs_list": [ 00:26:48.766 { 00:26:48.766 "name": null, 00:26:48.766 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.766 "is_configured": false, 00:26:48.766 "data_offset": 256, 00:26:48.766 "data_size": 7936 00:26:48.766 }, 00:26:48.766 { 00:26:48.766 "name": "pt2", 00:26:48.766 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:48.766 "is_configured": true, 00:26:48.766 "data_offset": 256, 00:26:48.766 "data_size": 7936 00:26:48.766 } 00:26:48.766 ] 00:26:48.766 }' 00:26:48.766 18:30:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:48.766 18:30:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:49.331 18:30:32 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:49.589 [2024-07-12 18:30:33.207009] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:49.589 [2024-07-12 18:30:33.207039] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:49.589 [2024-07-12 18:30:33.207090] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:49.589 [2024-07-12 18:30:33.207132] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:49.589 [2024-07-12 18:30:33.207143] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1558380 name raid_bdev1, state offline 00:26:49.589 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.589 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:26:49.846 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:26:49.846 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:26:49.846 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:26:49.846 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:50.103 [2024-07-12 18:30:33.680247] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:50.103 [2024-07-12 18:30:33.680293] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:50.103 [2024-07-12 18:30:33.680313] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1704520 00:26:50.103 [2024-07-12 18:30:33.680326] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:50.103 [2024-07-12 18:30:33.682002] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:50.103 [2024-07-12 18:30:33.682043] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:50.103 [2024-07-12 18:30:33.682110] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:50.103 [2024-07-12 18:30:33.682136] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:50.103 [2024-07-12 18:30:33.682237] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:50.103 [2024-07-12 18:30:33.682250] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:50.103 [2024-07-12 18:30:33.682263] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15593f0 name raid_bdev1, state configuring 00:26:50.103 [2024-07-12 18:30:33.682286] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:50.103 [2024-07-12 18:30:33.682343] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x155b2b0 00:26:50.103 [2024-07-12 18:30:33.682354] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:50.103 [2024-07-12 18:30:33.682520] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1558350 00:26:50.103 [2024-07-12 18:30:33.682640] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x155b2b0 00:26:50.103 [2024-07-12 18:30:33.682650] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x155b2b0 00:26:50.103 [2024-07-12 18:30:33.682748] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:26:50.103 pt1 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.103 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.360 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.360 "name": "raid_bdev1", 00:26:50.360 "uuid": "e4de1b50-c863-4327-a1b3-705bb3386632", 00:26:50.360 "strip_size_kb": 0, 00:26:50.360 "state": "online", 00:26:50.360 "raid_level": "raid1", 00:26:50.360 "superblock": true, 00:26:50.360 "num_base_bdevs": 2, 00:26:50.361 "num_base_bdevs_discovered": 1, 
00:26:50.361 "num_base_bdevs_operational": 1, 00:26:50.361 "base_bdevs_list": [ 00:26:50.361 { 00:26:50.361 "name": null, 00:26:50.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.361 "is_configured": false, 00:26:50.361 "data_offset": 256, 00:26:50.361 "data_size": 7936 00:26:50.361 }, 00:26:50.361 { 00:26:50.361 "name": "pt2", 00:26:50.361 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:50.361 "is_configured": true, 00:26:50.361 "data_offset": 256, 00:26:50.361 "data_size": 7936 00:26:50.361 } 00:26:50.361 ] 00:26:50.361 }' 00:26:50.361 18:30:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.361 18:30:33 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:50.926 18:30:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:50.927 18:30:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:51.184 18:30:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:26:51.184 18:30:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:51.184 18:30:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:26:51.443 [2024-07-12 18:30:34.991934] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' e4de1b50-c863-4327-a1b3-705bb3386632 '!=' e4de1b50-c863-4327-a1b3-705bb3386632 ']' 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 2599321 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- 
# '[' -z 2599321 ']' 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 2599321 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2599321 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2599321' 00:26:51.443 killing process with pid 2599321 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 2599321 00:26:51.443 [2024-07-12 18:30:35.060079] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:51.443 [2024-07-12 18:30:35.060130] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:51.443 [2024-07-12 18:30:35.060171] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:51.443 [2024-07-12 18:30:35.060182] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x155b2b0 name raid_bdev1, state offline 00:26:51.443 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 2599321 00:26:51.443 [2024-07-12 18:30:35.076462] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:51.701 18:30:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:26:51.701 00:26:51.701 real 0m15.223s 00:26:51.701 user 0m28.116s 00:26:51.701 sys 0m2.845s 00:26:51.701 18:30:35 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:26:51.701 18:30:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:51.701 ************************************ 00:26:51.701 END TEST raid_superblock_test_4k 00:26:51.701 ************************************ 00:26:51.701 18:30:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:51.701 18:30:35 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:26:51.701 18:30:35 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:26:51.701 18:30:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:51.701 18:30:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:51.701 18:30:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:51.701 ************************************ 00:26:51.701 START TEST raid_rebuild_test_sb_4k 00:26:51.701 ************************************ 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:51.701 
18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=2601681 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 2601681 /var/tmp/spdk-raid.sock 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2601681 ']' 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:51.701 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:51.701 18:30:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:51.960 [2024-07-12 18:30:35.432709] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:26:51.960 [2024-07-12 18:30:35.432778] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2601681 ] 00:26:51.960 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:51.960 Zero copy mechanism will not be used. 
00:26:51.960 [2024-07-12 18:30:35.551651] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:51.960 [2024-07-12 18:30:35.657391] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:52.227 [2024-07-12 18:30:35.722323] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:52.227 [2024-07-12 18:30:35.722359] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:52.836 18:30:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:52.836 18:30:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:52.836 18:30:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:52.836 18:30:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:26:53.095 BaseBdev1_malloc 00:26:53.096 18:30:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:53.354 [2024-07-12 18:30:36.824501] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:53.354 [2024-07-12 18:30:36.824548] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:53.354 [2024-07-12 18:30:36.824572] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24e9d40 00:26:53.354 [2024-07-12 18:30:36.824585] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:53.354 [2024-07-12 18:30:36.826294] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:53.354 [2024-07-12 18:30:36.826324] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:53.354 
BaseBdev1 00:26:53.354 18:30:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:53.354 18:30:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:26:53.354 BaseBdev2_malloc 00:26:53.612 18:30:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:53.612 [2024-07-12 18:30:37.306586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:53.612 [2024-07-12 18:30:37.306631] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:53.612 [2024-07-12 18:30:37.306654] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24ea860 00:26:53.612 [2024-07-12 18:30:37.306667] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:53.612 [2024-07-12 18:30:37.308164] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:53.612 [2024-07-12 18:30:37.308194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:53.612 BaseBdev2 00:26:53.612 18:30:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:26:53.870 spare_malloc 00:26:53.870 18:30:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:54.128 spare_delay 00:26:54.128 18:30:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:54.386 [2024-07-12 18:30:38.046312] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:54.386 [2024-07-12 18:30:38.046358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:54.386 [2024-07-12 18:30:38.046378] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2698ec0 00:26:54.386 [2024-07-12 18:30:38.046392] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:54.386 [2024-07-12 18:30:38.047999] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:54.386 [2024-07-12 18:30:38.048031] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:54.386 spare 00:26:54.386 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:54.645 [2024-07-12 18:30:38.278957] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:54.645 [2024-07-12 18:30:38.280287] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:54.645 [2024-07-12 18:30:38.280448] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x269a070 00:26:54.645 [2024-07-12 18:30:38.280462] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:54.645 [2024-07-12 18:30:38.280659] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2693490 00:26:54.645 [2024-07-12 18:30:38.280800] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x269a070 00:26:54.645 [2024-07-12 18:30:38.280810] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x269a070 00:26:54.645 [2024-07-12 18:30:38.280908] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.645 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:54.903 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:54.903 "name": "raid_bdev1", 00:26:54.903 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:26:54.903 "strip_size_kb": 0, 00:26:54.903 "state": "online", 00:26:54.903 "raid_level": "raid1", 00:26:54.903 "superblock": true, 00:26:54.903 "num_base_bdevs": 2, 00:26:54.903 
"num_base_bdevs_discovered": 2, 00:26:54.903 "num_base_bdevs_operational": 2, 00:26:54.903 "base_bdevs_list": [ 00:26:54.903 { 00:26:54.903 "name": "BaseBdev1", 00:26:54.903 "uuid": "b908bb79-3e5e-54f8-91c6-ddbeee905a24", 00:26:54.903 "is_configured": true, 00:26:54.903 "data_offset": 256, 00:26:54.903 "data_size": 7936 00:26:54.903 }, 00:26:54.903 { 00:26:54.903 "name": "BaseBdev2", 00:26:54.903 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:26:54.903 "is_configured": true, 00:26:54.903 "data_offset": 256, 00:26:54.903 "data_size": 7936 00:26:54.903 } 00:26:54.903 ] 00:26:54.903 }' 00:26:54.903 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:54.903 18:30:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:55.469 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:55.469 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:55.727 [2024-07-12 18:30:39.354006] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:55.727 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:26:55.727 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.727 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:55.985 
18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:55.985 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:56.243 [2024-07-12 18:30:39.863148] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2693490 00:26:56.243 /dev/nbd0 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:56.243 18:30:39 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:56.243 1+0 records in 00:26:56.243 1+0 records out 00:26:56.243 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218764 s, 18.7 MB/s 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:56.243 18:30:39 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:26:57.176 7936+0 records in 00:26:57.176 7936+0 records out 00:26:57.176 32505856 bytes (33 MB, 31 MiB) copied, 0.752826 s, 43.2 MB/s 00:26:57.176 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:57.176 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:57.176 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:57.176 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:57.176 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:57.176 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:57.176 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:57.434 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:57.434 [2024-07-12 18:30:40.944582] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:57.434 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:57.434 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:57.434 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:57.434 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:57.434 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:57.434 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:57.434 18:30:40 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:26:57.434 18:30:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:57.691 [2024-07-12 18:30:41.177246] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:57.691 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:57.691 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:57.691 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:57.691 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:57.691 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:57.691 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:57.691 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:57.692 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:57.692 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:57.692 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:57.692 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.692 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.949 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.949 "name": "raid_bdev1", 00:26:57.949 "uuid": 
"3b478957-8395-4e94-b309-ff999f5d79bb", 00:26:57.949 "strip_size_kb": 0, 00:26:57.949 "state": "online", 00:26:57.949 "raid_level": "raid1", 00:26:57.949 "superblock": true, 00:26:57.949 "num_base_bdevs": 2, 00:26:57.949 "num_base_bdevs_discovered": 1, 00:26:57.949 "num_base_bdevs_operational": 1, 00:26:57.949 "base_bdevs_list": [ 00:26:57.949 { 00:26:57.949 "name": null, 00:26:57.949 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.949 "is_configured": false, 00:26:57.949 "data_offset": 256, 00:26:57.949 "data_size": 7936 00:26:57.949 }, 00:26:57.949 { 00:26:57.949 "name": "BaseBdev2", 00:26:57.949 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:26:57.949 "is_configured": true, 00:26:57.949 "data_offset": 256, 00:26:57.949 "data_size": 7936 00:26:57.949 } 00:26:57.949 ] 00:26:57.949 }' 00:26:57.949 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.949 18:30:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:58.514 18:30:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:58.772 [2024-07-12 18:30:42.264126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:58.772 [2024-07-12 18:30:42.269034] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2699ce0 00:26:58.772 [2024-07-12 18:30:42.271237] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:58.772 18:30:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:59.704 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:59.704 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:59.704 18:30:43 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:59.705 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:59.705 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:59.705 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.705 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:59.962 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:59.962 "name": "raid_bdev1", 00:26:59.962 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:26:59.962 "strip_size_kb": 0, 00:26:59.962 "state": "online", 00:26:59.962 "raid_level": "raid1", 00:26:59.962 "superblock": true, 00:26:59.962 "num_base_bdevs": 2, 00:26:59.962 "num_base_bdevs_discovered": 2, 00:26:59.962 "num_base_bdevs_operational": 2, 00:26:59.962 "process": { 00:26:59.962 "type": "rebuild", 00:26:59.962 "target": "spare", 00:26:59.962 "progress": { 00:26:59.962 "blocks": 3072, 00:26:59.962 "percent": 38 00:26:59.962 } 00:26:59.962 }, 00:26:59.962 "base_bdevs_list": [ 00:26:59.962 { 00:26:59.962 "name": "spare", 00:26:59.962 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:26:59.962 "is_configured": true, 00:26:59.962 "data_offset": 256, 00:26:59.962 "data_size": 7936 00:26:59.962 }, 00:26:59.962 { 00:26:59.962 "name": "BaseBdev2", 00:26:59.962 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:26:59.962 "is_configured": true, 00:26:59.962 "data_offset": 256, 00:26:59.962 "data_size": 7936 00:26:59.962 } 00:26:59.962 ] 00:26:59.962 }' 00:26:59.962 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:59.962 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:59.962 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:59.962 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:59.962 18:30:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:00.526 [2024-07-12 18:30:44.110402] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:00.527 [2024-07-12 18:30:44.185845] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:00.527 [2024-07-12 18:30:44.185890] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:00.527 [2024-07-12 18:30:44.185906] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:00.527 [2024-07-12 18:30:44.185914] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.527 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:00.783 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:00.783 "name": "raid_bdev1", 00:27:00.783 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:00.783 "strip_size_kb": 0, 00:27:00.783 "state": "online", 00:27:00.783 "raid_level": "raid1", 00:27:00.783 "superblock": true, 00:27:00.783 "num_base_bdevs": 2, 00:27:00.783 "num_base_bdevs_discovered": 1, 00:27:00.783 "num_base_bdevs_operational": 1, 00:27:00.783 "base_bdevs_list": [ 00:27:00.783 { 00:27:00.783 "name": null, 00:27:00.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.783 "is_configured": false, 00:27:00.783 "data_offset": 256, 00:27:00.783 "data_size": 7936 00:27:00.783 }, 00:27:00.783 { 00:27:00.783 "name": "BaseBdev2", 00:27:00.783 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:00.783 "is_configured": true, 00:27:00.783 "data_offset": 256, 00:27:00.783 "data_size": 7936 00:27:00.783 } 00:27:00.783 ] 00:27:00.783 }' 00:27:00.783 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:00.783 18:30:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:01.349 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:01.349 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:27:01.349 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:01.349 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:01.349 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:01.349 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.349 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:01.607 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:01.607 "name": "raid_bdev1", 00:27:01.607 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:01.607 "strip_size_kb": 0, 00:27:01.607 "state": "online", 00:27:01.607 "raid_level": "raid1", 00:27:01.607 "superblock": true, 00:27:01.607 "num_base_bdevs": 2, 00:27:01.607 "num_base_bdevs_discovered": 1, 00:27:01.607 "num_base_bdevs_operational": 1, 00:27:01.607 "base_bdevs_list": [ 00:27:01.607 { 00:27:01.607 "name": null, 00:27:01.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:01.608 "is_configured": false, 00:27:01.608 "data_offset": 256, 00:27:01.608 "data_size": 7936 00:27:01.608 }, 00:27:01.608 { 00:27:01.608 "name": "BaseBdev2", 00:27:01.608 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:01.608 "is_configured": true, 00:27:01.608 "data_offset": 256, 00:27:01.608 "data_size": 7936 00:27:01.608 } 00:27:01.608 ] 00:27:01.608 }' 00:27:01.608 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:01.866 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:01.866 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:27:01.866 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:01.866 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:02.124 [2024-07-12 18:30:45.634634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:02.124 [2024-07-12 18:30:45.639581] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2699ce0 00:27:02.124 [2024-07-12 18:30:45.641043] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:02.124 18:30:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:03.058 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:03.058 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:03.058 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:03.058 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:03.058 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:03.058 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.058 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:03.317 "name": "raid_bdev1", 00:27:03.317 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:03.317 "strip_size_kb": 0, 00:27:03.317 "state": "online", 00:27:03.317 
"raid_level": "raid1", 00:27:03.317 "superblock": true, 00:27:03.317 "num_base_bdevs": 2, 00:27:03.317 "num_base_bdevs_discovered": 2, 00:27:03.317 "num_base_bdevs_operational": 2, 00:27:03.317 "process": { 00:27:03.317 "type": "rebuild", 00:27:03.317 "target": "spare", 00:27:03.317 "progress": { 00:27:03.317 "blocks": 3072, 00:27:03.317 "percent": 38 00:27:03.317 } 00:27:03.317 }, 00:27:03.317 "base_bdevs_list": [ 00:27:03.317 { 00:27:03.317 "name": "spare", 00:27:03.317 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:03.317 "is_configured": true, 00:27:03.317 "data_offset": 256, 00:27:03.317 "data_size": 7936 00:27:03.317 }, 00:27:03.317 { 00:27:03.317 "name": "BaseBdev2", 00:27:03.317 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:03.317 "is_configured": true, 00:27:03.317 "data_offset": 256, 00:27:03.317 "data_size": 7936 00:27:03.317 } 00:27:03.317 ] 00:27:03.317 }' 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:03.317 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1019 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:03.317 18:30:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.317 18:30:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.575 18:30:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:03.575 "name": "raid_bdev1", 00:27:03.575 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:03.575 "strip_size_kb": 0, 00:27:03.575 "state": "online", 00:27:03.575 "raid_level": "raid1", 00:27:03.575 "superblock": true, 00:27:03.575 "num_base_bdevs": 2, 00:27:03.575 "num_base_bdevs_discovered": 2, 00:27:03.575 "num_base_bdevs_operational": 2, 00:27:03.575 "process": { 00:27:03.575 "type": "rebuild", 00:27:03.575 "target": "spare", 00:27:03.575 "progress": { 00:27:03.575 "blocks": 3840, 00:27:03.575 "percent": 48 00:27:03.575 } 00:27:03.575 }, 00:27:03.575 "base_bdevs_list": [ 00:27:03.575 { 00:27:03.575 "name": "spare", 00:27:03.575 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:03.576 "is_configured": 
true, 00:27:03.576 "data_offset": 256, 00:27:03.576 "data_size": 7936 00:27:03.576 }, 00:27:03.576 { 00:27:03.576 "name": "BaseBdev2", 00:27:03.576 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:03.576 "is_configured": true, 00:27:03.576 "data_offset": 256, 00:27:03.576 "data_size": 7936 00:27:03.576 } 00:27:03.576 ] 00:27:03.576 }' 00:27:03.576 18:30:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:03.576 18:30:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:03.576 18:30:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:03.834 18:30:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:03.834 18:30:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:04.768 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:04.768 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:04.768 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:04.768 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:04.768 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:04.768 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:04.768 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.768 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:05.026 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:05.026 "name": "raid_bdev1", 00:27:05.026 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:05.026 "strip_size_kb": 0, 00:27:05.026 "state": "online", 00:27:05.026 "raid_level": "raid1", 00:27:05.026 "superblock": true, 00:27:05.026 "num_base_bdevs": 2, 00:27:05.026 "num_base_bdevs_discovered": 2, 00:27:05.026 "num_base_bdevs_operational": 2, 00:27:05.026 "process": { 00:27:05.026 "type": "rebuild", 00:27:05.026 "target": "spare", 00:27:05.026 "progress": { 00:27:05.026 "blocks": 7168, 00:27:05.026 "percent": 90 00:27:05.026 } 00:27:05.026 }, 00:27:05.026 "base_bdevs_list": [ 00:27:05.026 { 00:27:05.026 "name": "spare", 00:27:05.026 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:05.026 "is_configured": true, 00:27:05.026 "data_offset": 256, 00:27:05.026 "data_size": 7936 00:27:05.026 }, 00:27:05.026 { 00:27:05.026 "name": "BaseBdev2", 00:27:05.026 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:05.026 "is_configured": true, 00:27:05.026 "data_offset": 256, 00:27:05.026 "data_size": 7936 00:27:05.026 } 00:27:05.026 ] 00:27:05.026 }' 00:27:05.026 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:05.026 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:05.026 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:05.026 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:05.026 18:30:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:05.284 [2024-07-12 18:30:48.765083] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:05.284 [2024-07-12 18:30:48.765140] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:05.284 [2024-07-12 18:30:48.765218] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:06.218 "name": "raid_bdev1", 00:27:06.218 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:06.218 "strip_size_kb": 0, 00:27:06.218 "state": "online", 00:27:06.218 "raid_level": "raid1", 00:27:06.218 "superblock": true, 00:27:06.218 "num_base_bdevs": 2, 00:27:06.218 "num_base_bdevs_discovered": 2, 00:27:06.218 "num_base_bdevs_operational": 2, 00:27:06.218 "base_bdevs_list": [ 00:27:06.218 { 00:27:06.218 "name": "spare", 00:27:06.218 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:06.218 "is_configured": true, 00:27:06.218 "data_offset": 256, 00:27:06.218 "data_size": 7936 00:27:06.218 }, 00:27:06.218 { 00:27:06.218 "name": "BaseBdev2", 00:27:06.218 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:06.218 "is_configured": true, 00:27:06.218 "data_offset": 256, 00:27:06.218 
"data_size": 7936 00:27:06.218 } 00:27:06.218 ] 00:27:06.218 }' 00:27:06.218 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:06.476 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:06.476 18:30:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:06.476 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:06.476 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:27:06.477 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:06.477 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:06.477 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:06.477 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:06.477 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:06.477 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.477 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:06.735 "name": "raid_bdev1", 00:27:06.735 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:06.735 "strip_size_kb": 0, 00:27:06.735 "state": "online", 00:27:06.735 "raid_level": "raid1", 00:27:06.735 "superblock": true, 00:27:06.735 "num_base_bdevs": 2, 00:27:06.735 "num_base_bdevs_discovered": 2, 00:27:06.735 "num_base_bdevs_operational": 2, 00:27:06.735 
"base_bdevs_list": [ 00:27:06.735 { 00:27:06.735 "name": "spare", 00:27:06.735 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:06.735 "is_configured": true, 00:27:06.735 "data_offset": 256, 00:27:06.735 "data_size": 7936 00:27:06.735 }, 00:27:06.735 { 00:27:06.735 "name": "BaseBdev2", 00:27:06.735 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:06.735 "is_configured": true, 00:27:06.735 "data_offset": 256, 00:27:06.735 "data_size": 7936 00:27:06.735 } 00:27:06.735 ] 00:27:06.735 }' 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.735 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.993 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:06.993 "name": "raid_bdev1", 00:27:06.993 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:06.993 "strip_size_kb": 0, 00:27:06.993 "state": "online", 00:27:06.993 "raid_level": "raid1", 00:27:06.993 "superblock": true, 00:27:06.993 "num_base_bdevs": 2, 00:27:06.993 "num_base_bdevs_discovered": 2, 00:27:06.993 "num_base_bdevs_operational": 2, 00:27:06.993 "base_bdevs_list": [ 00:27:06.993 { 00:27:06.993 "name": "spare", 00:27:06.993 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:06.993 "is_configured": true, 00:27:06.993 "data_offset": 256, 00:27:06.993 "data_size": 7936 00:27:06.993 }, 00:27:06.993 { 00:27:06.993 "name": "BaseBdev2", 00:27:06.993 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:06.993 "is_configured": true, 00:27:06.993 "data_offset": 256, 00:27:06.993 "data_size": 7936 00:27:06.993 } 00:27:06.993 ] 00:27:06.993 }' 00:27:06.993 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:06.993 18:30:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:07.559 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:07.817 [2024-07-12 18:30:51.417436] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:07.817 [2024-07-12 18:30:51.417462] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:27:07.817 [2024-07-12 18:30:51.417516] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:07.817 [2024-07-12 18:30:51.417569] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:07.817 [2024-07-12 18:30:51.417581] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x269a070 name raid_bdev1, state offline 00:27:07.817 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.817 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i 
= 0 )) 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:08.076 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:08.334 /dev/nbd0 00:27:08.334 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:08.334 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:08.334 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:08.334 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:08.334 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:08.334 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:08.335 1+0 records in 00:27:08.335 1+0 records out 00:27:08.335 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250683 s, 16.3 MB/s 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:08.335 18:30:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:08.592 /dev/nbd1 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:08.592 1+0 records in 00:27:08.592 1+0 records out 00:27:08.592 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241749 s, 16.9 MB/s 00:27:08.592 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:08.593 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:08.593 18:30:52 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:08.850 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:09.107 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:09.107 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:09.107 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:09.107 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:09.107 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:09.107 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:09.107 18:30:52 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:09.107 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:09.107 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:09.107 18:30:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:09.364 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:09.621 [2024-07-12 18:30:53.260411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:09.621 [2024-07-12 18:30:53.260452] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:09.621 [2024-07-12 18:30:53.260472] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2699500 00:27:09.621 [2024-07-12 18:30:53.260484] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:09.621 [2024-07-12 18:30:53.262094] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:09.621 [2024-07-12 18:30:53.262125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:09.621 [2024-07-12 18:30:53.262201] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:09.621 [2024-07-12 18:30:53.262227] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:09.621 [2024-07-12 18:30:53.262326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:09.621 spare 00:27:09.621 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:09.621 18:30:53 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:09.621 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:09.621 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:09.621 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:09.621 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:09.621 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:09.621 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:09.621 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:09.622 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:09.622 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.622 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.880 [2024-07-12 18:30:53.362646] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x269a7b0 00:27:09.880 [2024-07-12 18:30:53.362663] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:09.880 [2024-07-12 18:30:53.362854] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2692f50 00:27:09.880 [2024-07-12 18:30:53.363010] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x269a7b0 00:27:09.880 [2024-07-12 18:30:53.363021] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x269a7b0 00:27:09.880 [2024-07-12 18:30:53.363124] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:09.880 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:09.880 "name": "raid_bdev1", 00:27:09.880 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:09.880 "strip_size_kb": 0, 00:27:09.880 "state": "online", 00:27:09.880 "raid_level": "raid1", 00:27:09.880 "superblock": true, 00:27:09.880 "num_base_bdevs": 2, 00:27:09.880 "num_base_bdevs_discovered": 2, 00:27:09.880 "num_base_bdevs_operational": 2, 00:27:09.880 "base_bdevs_list": [ 00:27:09.880 { 00:27:09.880 "name": "spare", 00:27:09.880 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:09.880 "is_configured": true, 00:27:09.880 "data_offset": 256, 00:27:09.880 "data_size": 7936 00:27:09.880 }, 00:27:09.880 { 00:27:09.880 "name": "BaseBdev2", 00:27:09.880 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:09.880 "is_configured": true, 00:27:09.880 "data_offset": 256, 00:27:09.880 "data_size": 7936 00:27:09.880 } 00:27:09.880 ] 00:27:09.880 }' 00:27:09.880 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:09.880 18:30:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:10.442 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:10.442 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:10.442 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:10.442 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:10.442 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:10.442 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:27:10.442 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:10.699 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:10.699 "name": "raid_bdev1", 00:27:10.699 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:10.699 "strip_size_kb": 0, 00:27:10.699 "state": "online", 00:27:10.699 "raid_level": "raid1", 00:27:10.699 "superblock": true, 00:27:10.699 "num_base_bdevs": 2, 00:27:10.699 "num_base_bdevs_discovered": 2, 00:27:10.699 "num_base_bdevs_operational": 2, 00:27:10.699 "base_bdevs_list": [ 00:27:10.699 { 00:27:10.699 "name": "spare", 00:27:10.699 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:10.699 "is_configured": true, 00:27:10.699 "data_offset": 256, 00:27:10.699 "data_size": 7936 00:27:10.699 }, 00:27:10.699 { 00:27:10.699 "name": "BaseBdev2", 00:27:10.699 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:10.699 "is_configured": true, 00:27:10.699 "data_offset": 256, 00:27:10.699 "data_size": 7936 00:27:10.699 } 00:27:10.699 ] 00:27:10.699 }' 00:27:10.699 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:10.699 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:10.699 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:10.956 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:10.956 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.956 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:10.956 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare 
== \s\p\a\r\e ]] 00:27:10.956 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:11.214 [2024-07-12 18:30:54.904903] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.214 18:30:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:11.472 18:30:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:11.472 "name": "raid_bdev1", 00:27:11.472 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 
00:27:11.472 "strip_size_kb": 0, 00:27:11.472 "state": "online", 00:27:11.472 "raid_level": "raid1", 00:27:11.472 "superblock": true, 00:27:11.472 "num_base_bdevs": 2, 00:27:11.472 "num_base_bdevs_discovered": 1, 00:27:11.472 "num_base_bdevs_operational": 1, 00:27:11.472 "base_bdevs_list": [ 00:27:11.472 { 00:27:11.472 "name": null, 00:27:11.472 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:11.472 "is_configured": false, 00:27:11.472 "data_offset": 256, 00:27:11.472 "data_size": 7936 00:27:11.472 }, 00:27:11.472 { 00:27:11.472 "name": "BaseBdev2", 00:27:11.472 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:11.472 "is_configured": true, 00:27:11.472 "data_offset": 256, 00:27:11.472 "data_size": 7936 00:27:11.472 } 00:27:11.472 ] 00:27:11.472 }' 00:27:11.472 18:30:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:11.472 18:30:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:12.085 18:30:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:12.662 [2024-07-12 18:30:56.252499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:12.662 [2024-07-12 18:30:56.252644] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:12.662 [2024-07-12 18:30:56.252666] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:12.662 [2024-07-12 18:30:56.252695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:12.662 [2024-07-12 18:30:56.257519] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2692f50 00:27:12.662 [2024-07-12 18:30:56.259830] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:12.662 18:30:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:13.595 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:13.595 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:13.595 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:13.595 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:13.595 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:13.595 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.595 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.851 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:13.851 "name": "raid_bdev1", 00:27:13.851 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:13.851 "strip_size_kb": 0, 00:27:13.851 "state": "online", 00:27:13.851 "raid_level": "raid1", 00:27:13.851 "superblock": true, 00:27:13.851 "num_base_bdevs": 2, 00:27:13.851 "num_base_bdevs_discovered": 2, 00:27:13.851 "num_base_bdevs_operational": 2, 00:27:13.851 "process": { 00:27:13.851 "type": "rebuild", 00:27:13.851 "target": "spare", 00:27:13.851 "progress": { 00:27:13.851 "blocks": 3072, 
00:27:13.851 "percent": 38 00:27:13.851 } 00:27:13.851 }, 00:27:13.851 "base_bdevs_list": [ 00:27:13.851 { 00:27:13.851 "name": "spare", 00:27:13.851 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:13.851 "is_configured": true, 00:27:13.851 "data_offset": 256, 00:27:13.851 "data_size": 7936 00:27:13.851 }, 00:27:13.851 { 00:27:13.851 "name": "BaseBdev2", 00:27:13.851 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:13.851 "is_configured": true, 00:27:13.851 "data_offset": 256, 00:27:13.851 "data_size": 7936 00:27:13.851 } 00:27:13.851 ] 00:27:13.851 }' 00:27:13.851 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:14.109 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:14.109 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:14.109 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:14.109 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:14.110 [2024-07-12 18:30:57.833866] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:14.367 [2024-07-12 18:30:57.872442] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:14.367 [2024-07-12 18:30:57.872485] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:14.367 [2024-07-12 18:30:57.872501] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:14.367 [2024-07-12 18:30:57.872509] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.367 18:30:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:14.625 18:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:14.625 "name": "raid_bdev1", 00:27:14.625 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:14.625 "strip_size_kb": 0, 00:27:14.625 "state": "online", 00:27:14.625 "raid_level": "raid1", 00:27:14.625 "superblock": true, 00:27:14.625 "num_base_bdevs": 2, 00:27:14.625 "num_base_bdevs_discovered": 1, 00:27:14.625 "num_base_bdevs_operational": 1, 00:27:14.625 "base_bdevs_list": [ 00:27:14.625 { 00:27:14.625 "name": null, 00:27:14.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:14.625 "is_configured": false, 00:27:14.625 "data_offset": 
256, 00:27:14.625 "data_size": 7936 00:27:14.625 }, 00:27:14.625 { 00:27:14.625 "name": "BaseBdev2", 00:27:14.625 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:14.625 "is_configured": true, 00:27:14.625 "data_offset": 256, 00:27:14.625 "data_size": 7936 00:27:14.625 } 00:27:14.625 ] 00:27:14.625 }' 00:27:14.625 18:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:14.625 18:30:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:15.188 18:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:15.444 [2024-07-12 18:30:58.975840] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:15.444 [2024-07-12 18:30:58.975888] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:15.444 [2024-07-12 18:30:58.975909] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2696ad0 00:27:15.444 [2024-07-12 18:30:58.975921] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:15.444 [2024-07-12 18:30:58.976298] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:15.444 [2024-07-12 18:30:58.976320] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:15.444 [2024-07-12 18:30:58.976397] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:15.444 [2024-07-12 18:30:58.976411] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:15.444 [2024-07-12 18:30:58.976422] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:15.445 [2024-07-12 18:30:58.976441] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:15.445 [2024-07-12 18:30:58.981966] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2694a60 00:27:15.445 spare 00:27:15.445 [2024-07-12 18:30:58.983464] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:15.445 18:30:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:16.373 18:30:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:16.373 18:30:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:16.373 18:30:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:16.373 18:30:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:16.373 18:30:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:16.373 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.374 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.631 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:16.631 "name": "raid_bdev1", 00:27:16.631 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:16.631 "strip_size_kb": 0, 00:27:16.631 "state": "online", 00:27:16.631 "raid_level": "raid1", 00:27:16.631 "superblock": true, 00:27:16.631 "num_base_bdevs": 2, 00:27:16.632 "num_base_bdevs_discovered": 2, 00:27:16.632 "num_base_bdevs_operational": 2, 00:27:16.632 "process": { 00:27:16.632 "type": "rebuild", 00:27:16.632 "target": "spare", 00:27:16.632 "progress": { 00:27:16.632 
"blocks": 2816, 00:27:16.632 "percent": 35 00:27:16.632 } 00:27:16.632 }, 00:27:16.632 "base_bdevs_list": [ 00:27:16.632 { 00:27:16.632 "name": "spare", 00:27:16.632 "uuid": "e75d0853-6262-51af-be09-d68c60034700", 00:27:16.632 "is_configured": true, 00:27:16.632 "data_offset": 256, 00:27:16.632 "data_size": 7936 00:27:16.632 }, 00:27:16.632 { 00:27:16.632 "name": "BaseBdev2", 00:27:16.632 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:16.632 "is_configured": true, 00:27:16.632 "data_offset": 256, 00:27:16.632 "data_size": 7936 00:27:16.632 } 00:27:16.632 ] 00:27:16.632 }' 00:27:16.632 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:16.632 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:16.632 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:16.632 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:16.632 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:16.889 [2024-07-12 18:31:00.522558] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:16.889 [2024-07-12 18:31:00.596058] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:16.889 [2024-07-12 18:31:00.596101] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:16.889 [2024-07-12 18:31:00.596116] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:16.889 [2024-07-12 18:31:00.596124] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:17.147 "name": "raid_bdev1", 00:27:17.147 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:17.147 "strip_size_kb": 0, 00:27:17.147 "state": "online", 00:27:17.147 "raid_level": "raid1", 00:27:17.147 "superblock": true, 00:27:17.147 "num_base_bdevs": 2, 00:27:17.147 "num_base_bdevs_discovered": 1, 00:27:17.147 "num_base_bdevs_operational": 1, 00:27:17.147 "base_bdevs_list": [ 00:27:17.147 { 00:27:17.147 "name": null, 00:27:17.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:17.147 "is_configured": false, 00:27:17.147 
"data_offset": 256, 00:27:17.147 "data_size": 7936 00:27:17.147 }, 00:27:17.147 { 00:27:17.147 "name": "BaseBdev2", 00:27:17.147 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:17.147 "is_configured": true, 00:27:17.147 "data_offset": 256, 00:27:17.147 "data_size": 7936 00:27:17.147 } 00:27:17.147 ] 00:27:17.147 }' 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:17.147 18:31:00 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:18.080 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:18.080 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:18.080 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:18.080 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:18.081 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:18.081 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.081 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.081 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:18.081 "name": "raid_bdev1", 00:27:18.081 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:18.081 "strip_size_kb": 0, 00:27:18.081 "state": "online", 00:27:18.081 "raid_level": "raid1", 00:27:18.081 "superblock": true, 00:27:18.081 "num_base_bdevs": 2, 00:27:18.081 "num_base_bdevs_discovered": 1, 00:27:18.081 "num_base_bdevs_operational": 1, 00:27:18.081 "base_bdevs_list": [ 00:27:18.081 { 00:27:18.081 "name": null, 00:27:18.081 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:27:18.081 "is_configured": false, 00:27:18.081 "data_offset": 256, 00:27:18.081 "data_size": 7936 00:27:18.081 }, 00:27:18.081 { 00:27:18.081 "name": "BaseBdev2", 00:27:18.081 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:18.081 "is_configured": true, 00:27:18.081 "data_offset": 256, 00:27:18.081 "data_size": 7936 00:27:18.081 } 00:27:18.081 ] 00:27:18.081 }' 00:27:18.081 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:18.081 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:18.081 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:18.081 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:18.081 18:31:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:18.339 18:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:18.597 [2024-07-12 18:31:02.256667] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:18.597 [2024-07-12 18:31:02.256712] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:18.597 [2024-07-12 18:31:02.256734] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2605980 00:27:18.597 [2024-07-12 18:31:02.256747] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:18.597 [2024-07-12 18:31:02.257089] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:18.597 [2024-07-12 18:31:02.257110] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:27:18.597 [2024-07-12 18:31:02.257172] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:18.597 [2024-07-12 18:31:02.257185] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:18.597 [2024-07-12 18:31:02.257196] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:18.597 BaseBdev1 00:27:18.597 18:31:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:19.969 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:19.969 18:31:03 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.970 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:19.970 "name": "raid_bdev1", 00:27:19.970 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:19.970 "strip_size_kb": 0, 00:27:19.970 "state": "online", 00:27:19.970 "raid_level": "raid1", 00:27:19.970 "superblock": true, 00:27:19.970 "num_base_bdevs": 2, 00:27:19.970 "num_base_bdevs_discovered": 1, 00:27:19.970 "num_base_bdevs_operational": 1, 00:27:19.970 "base_bdevs_list": [ 00:27:19.970 { 00:27:19.970 "name": null, 00:27:19.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.970 "is_configured": false, 00:27:19.970 "data_offset": 256, 00:27:19.970 "data_size": 7936 00:27:19.970 }, 00:27:19.970 { 00:27:19.970 "name": "BaseBdev2", 00:27:19.970 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:19.970 "is_configured": true, 00:27:19.970 "data_offset": 256, 00:27:19.970 "data_size": 7936 00:27:19.970 } 00:27:19.970 ] 00:27:19.970 }' 00:27:19.970 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:19.970 18:31:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:20.544 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:20.544 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:20.544 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:20.544 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:20.544 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:20.544 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.544 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:20.803 "name": "raid_bdev1", 00:27:20.803 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:20.803 "strip_size_kb": 0, 00:27:20.803 "state": "online", 00:27:20.803 "raid_level": "raid1", 00:27:20.803 "superblock": true, 00:27:20.803 "num_base_bdevs": 2, 00:27:20.803 "num_base_bdevs_discovered": 1, 00:27:20.803 "num_base_bdevs_operational": 1, 00:27:20.803 "base_bdevs_list": [ 00:27:20.803 { 00:27:20.803 "name": null, 00:27:20.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:20.803 "is_configured": false, 00:27:20.803 "data_offset": 256, 00:27:20.803 "data_size": 7936 00:27:20.803 }, 00:27:20.803 { 00:27:20.803 "name": "BaseBdev2", 00:27:20.803 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:20.803 "is_configured": true, 00:27:20.803 "data_offset": 256, 00:27:20.803 "data_size": 7936 00:27:20.803 } 00:27:20.803 ] 00:27:20.803 }' 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:20.803 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:21.061 [2024-07-12 18:31:04.683140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:21.061 [2024-07-12 18:31:04.683257] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:21.061 [2024-07-12 18:31:04.683272] bdev_raid.c:3581:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:27:21.061 request: 00:27:21.061 { 00:27:21.061 "base_bdev": "BaseBdev1", 00:27:21.061 "raid_bdev": "raid_bdev1", 00:27:21.061 "method": "bdev_raid_add_base_bdev", 00:27:21.061 "req_id": 1 00:27:21.061 } 00:27:21.061 Got JSON-RPC error response 00:27:21.061 response: 00:27:21.061 { 00:27:21.061 "code": -22, 00:27:21.061 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:21.061 } 00:27:21.061 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:27:21.061 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:21.061 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:21.061 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:21.061 18:31:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:21.994 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:21.994 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.994 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.994 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.994 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.994 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:21.994 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.995 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.995 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:27:21.995 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.995 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.995 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:22.252 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:22.252 "name": "raid_bdev1", 00:27:22.252 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:22.252 "strip_size_kb": 0, 00:27:22.252 "state": "online", 00:27:22.252 "raid_level": "raid1", 00:27:22.252 "superblock": true, 00:27:22.252 "num_base_bdevs": 2, 00:27:22.252 "num_base_bdevs_discovered": 1, 00:27:22.252 "num_base_bdevs_operational": 1, 00:27:22.252 "base_bdevs_list": [ 00:27:22.252 { 00:27:22.252 "name": null, 00:27:22.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:22.252 "is_configured": false, 00:27:22.252 "data_offset": 256, 00:27:22.252 "data_size": 7936 00:27:22.252 }, 00:27:22.252 { 00:27:22.252 "name": "BaseBdev2", 00:27:22.252 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:22.252 "is_configured": true, 00:27:22.252 "data_offset": 256, 00:27:22.252 "data_size": 7936 00:27:22.252 } 00:27:22.252 ] 00:27:22.252 }' 00:27:22.252 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:22.252 18:31:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:23.186 
18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:23.186 "name": "raid_bdev1", 00:27:23.186 "uuid": "3b478957-8395-4e94-b309-ff999f5d79bb", 00:27:23.186 "strip_size_kb": 0, 00:27:23.186 "state": "online", 00:27:23.186 "raid_level": "raid1", 00:27:23.186 "superblock": true, 00:27:23.186 "num_base_bdevs": 2, 00:27:23.186 "num_base_bdevs_discovered": 1, 00:27:23.186 "num_base_bdevs_operational": 1, 00:27:23.186 "base_bdevs_list": [ 00:27:23.186 { 00:27:23.186 "name": null, 00:27:23.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.186 "is_configured": false, 00:27:23.186 "data_offset": 256, 00:27:23.186 "data_size": 7936 00:27:23.186 }, 00:27:23.186 { 00:27:23.186 "name": "BaseBdev2", 00:27:23.186 "uuid": "ef2cf9b7-cc87-53c4-bdb7-0dd2bc9ef946", 00:27:23.186 "is_configured": true, 00:27:23.186 "data_offset": 256, 00:27:23.186 "data_size": 7936 00:27:23.186 } 00:27:23.186 ] 00:27:23.186 }' 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:23.186 18:31:06 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 2601681 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2601681 ']' 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2601681 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2601681 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2601681' 00:27:23.186 killing process with pid 2601681 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2601681 00:27:23.186 Received shutdown signal, test time was about 60.000000 seconds 00:27:23.186 00:27:23.186 Latency(us) 00:27:23.186 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:23.186 =================================================================================================================== 00:27:23.186 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:23.186 [2024-07-12 18:31:06.889634] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:23.186 [2024-07-12 18:31:06.889722] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:23.186 [2024-07-12 18:31:06.889767] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:23.186 [2024-07-12 18:31:06.889781] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x269a7b0 name raid_bdev1, state offline 00:27:23.186 18:31:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2601681 00:27:23.445 [2024-07-12 18:31:06.917789] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:23.445 18:31:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:27:23.445 00:27:23.445 real 0m31.775s 00:27:23.445 user 0m49.701s 00:27:23.445 sys 0m5.113s 00:27:23.445 18:31:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:23.445 18:31:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:23.445 ************************************ 00:27:23.445 END TEST raid_rebuild_test_sb_4k 00:27:23.445 ************************************ 00:27:23.703 18:31:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:23.703 18:31:07 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:27:23.703 18:31:07 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:27:23.703 18:31:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:23.703 18:31:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:23.703 18:31:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:23.703 ************************************ 00:27:23.703 START TEST raid_state_function_test_sb_md_separate 00:27:23.703 ************************************ 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:23.703 
18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:23.703 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:23.704 18:31:07 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=2606181 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2606181' 00:27:23.704 Process raid pid: 2606181 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 2606181 /var/tmp/spdk-raid.sock 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2606181 ']' 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:23.704 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:23.704 18:31:07 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:23.704 [2024-07-12 18:31:07.301607] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:27:23.704 [2024-07-12 18:31:07.301683] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:23.962 [2024-07-12 18:31:07.432399] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:23.962 [2024-07-12 18:31:07.529617] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:23.962 [2024-07-12 18:31:07.587324] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:23.962 [2024-07-12 18:31:07.587359] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:24.530 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:24.530 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:24.530 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:24.789 [2024-07-12 18:31:08.383520] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:24.789 [2024-07-12 18:31:08.383565] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:24.789 [2024-07-12 18:31:08.383576] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:24.789 
[2024-07-12 18:31:08.383588] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.789 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:25.048 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:25.048 "name": "Existed_Raid", 00:27:25.048 "uuid": 
"145d6b35-633a-4b15-bb4d-fdc3372ef065", 00:27:25.048 "strip_size_kb": 0, 00:27:25.048 "state": "configuring", 00:27:25.048 "raid_level": "raid1", 00:27:25.048 "superblock": true, 00:27:25.048 "num_base_bdevs": 2, 00:27:25.048 "num_base_bdevs_discovered": 0, 00:27:25.048 "num_base_bdevs_operational": 2, 00:27:25.048 "base_bdevs_list": [ 00:27:25.048 { 00:27:25.048 "name": "BaseBdev1", 00:27:25.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.048 "is_configured": false, 00:27:25.048 "data_offset": 0, 00:27:25.048 "data_size": 0 00:27:25.048 }, 00:27:25.048 { 00:27:25.048 "name": "BaseBdev2", 00:27:25.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.048 "is_configured": false, 00:27:25.048 "data_offset": 0, 00:27:25.048 "data_size": 0 00:27:25.048 } 00:27:25.048 ] 00:27:25.048 }' 00:27:25.048 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:25.048 18:31:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:25.615 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:25.873 [2024-07-12 18:31:09.466254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:25.873 [2024-07-12 18:31:09.466285] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1daaa80 name Existed_Raid, state configuring 00:27:25.873 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:26.131 [2024-07-12 18:31:09.706906] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:26.131 [2024-07-12 18:31:09.706942] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:26.131 [2024-07-12 18:31:09.706952] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:26.131 [2024-07-12 18:31:09.706963] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:26.131 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:27:26.389 [2024-07-12 18:31:09.959275] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:26.389 BaseBdev1 00:27:26.389 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:26.389 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:26.389 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:26.389 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:26.389 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:26.389 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:26.389 18:31:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:26.648 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:26.906 [ 00:27:26.906 { 00:27:26.906 "name": 
"BaseBdev1", 00:27:26.906 "aliases": [ 00:27:26.906 "df50682c-774b-48bf-b4dd-61f53de37d0c" 00:27:26.906 ], 00:27:26.906 "product_name": "Malloc disk", 00:27:26.906 "block_size": 4096, 00:27:26.906 "num_blocks": 8192, 00:27:26.906 "uuid": "df50682c-774b-48bf-b4dd-61f53de37d0c", 00:27:26.906 "md_size": 32, 00:27:26.906 "md_interleave": false, 00:27:26.906 "dif_type": 0, 00:27:26.906 "assigned_rate_limits": { 00:27:26.906 "rw_ios_per_sec": 0, 00:27:26.906 "rw_mbytes_per_sec": 0, 00:27:26.906 "r_mbytes_per_sec": 0, 00:27:26.906 "w_mbytes_per_sec": 0 00:27:26.906 }, 00:27:26.906 "claimed": true, 00:27:26.906 "claim_type": "exclusive_write", 00:27:26.906 "zoned": false, 00:27:26.906 "supported_io_types": { 00:27:26.906 "read": true, 00:27:26.906 "write": true, 00:27:26.906 "unmap": true, 00:27:26.906 "flush": true, 00:27:26.906 "reset": true, 00:27:26.906 "nvme_admin": false, 00:27:26.906 "nvme_io": false, 00:27:26.906 "nvme_io_md": false, 00:27:26.906 "write_zeroes": true, 00:27:26.906 "zcopy": true, 00:27:26.906 "get_zone_info": false, 00:27:26.906 "zone_management": false, 00:27:26.906 "zone_append": false, 00:27:26.906 "compare": false, 00:27:26.906 "compare_and_write": false, 00:27:26.906 "abort": true, 00:27:26.906 "seek_hole": false, 00:27:26.906 "seek_data": false, 00:27:26.906 "copy": true, 00:27:26.906 "nvme_iov_md": false 00:27:26.906 }, 00:27:26.906 "memory_domains": [ 00:27:26.906 { 00:27:26.906 "dma_device_id": "system", 00:27:26.906 "dma_device_type": 1 00:27:26.906 }, 00:27:26.906 { 00:27:26.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:26.906 "dma_device_type": 2 00:27:26.906 } 00:27:26.906 ], 00:27:26.906 "driver_specific": {} 00:27:26.906 } 00:27:26.906 ] 00:27:26.906 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:26.906 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:26.907 
18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.907 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:27.165 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:27.165 "name": "Existed_Raid", 00:27:27.165 "uuid": "190da272-42f8-4d97-b892-e2719e98be8e", 00:27:27.165 "strip_size_kb": 0, 00:27:27.165 "state": "configuring", 00:27:27.165 "raid_level": "raid1", 00:27:27.165 "superblock": true, 00:27:27.165 "num_base_bdevs": 2, 00:27:27.165 "num_base_bdevs_discovered": 1, 00:27:27.165 "num_base_bdevs_operational": 2, 00:27:27.165 
"base_bdevs_list": [ 00:27:27.165 { 00:27:27.165 "name": "BaseBdev1", 00:27:27.165 "uuid": "df50682c-774b-48bf-b4dd-61f53de37d0c", 00:27:27.165 "is_configured": true, 00:27:27.165 "data_offset": 256, 00:27:27.165 "data_size": 7936 00:27:27.165 }, 00:27:27.165 { 00:27:27.165 "name": "BaseBdev2", 00:27:27.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.165 "is_configured": false, 00:27:27.165 "data_offset": 0, 00:27:27.165 "data_size": 0 00:27:27.165 } 00:27:27.165 ] 00:27:27.165 }' 00:27:27.165 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:27.165 18:31:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:27.756 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:28.014 [2024-07-12 18:31:11.543494] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:28.014 [2024-07-12 18:31:11.543533] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1daa350 name Existed_Raid, state configuring 00:27:28.014 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:28.272 [2024-07-12 18:31:11.788183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:28.272 [2024-07-12 18:31:11.789678] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:28.272 [2024-07-12 18:31:11.789713] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:28.272 
18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.272 18:31:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:28.539 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:28.539 "name": "Existed_Raid", 00:27:28.539 "uuid": 
"7857e012-85b4-42ed-a158-88b7df1add2d", 00:27:28.539 "strip_size_kb": 0, 00:27:28.539 "state": "configuring", 00:27:28.539 "raid_level": "raid1", 00:27:28.539 "superblock": true, 00:27:28.539 "num_base_bdevs": 2, 00:27:28.539 "num_base_bdevs_discovered": 1, 00:27:28.539 "num_base_bdevs_operational": 2, 00:27:28.539 "base_bdevs_list": [ 00:27:28.539 { 00:27:28.539 "name": "BaseBdev1", 00:27:28.539 "uuid": "df50682c-774b-48bf-b4dd-61f53de37d0c", 00:27:28.539 "is_configured": true, 00:27:28.539 "data_offset": 256, 00:27:28.539 "data_size": 7936 00:27:28.539 }, 00:27:28.539 { 00:27:28.539 "name": "BaseBdev2", 00:27:28.539 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:28.539 "is_configured": false, 00:27:28.539 "data_offset": 0, 00:27:28.539 "data_size": 0 00:27:28.539 } 00:27:28.539 ] 00:27:28.539 }' 00:27:28.539 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:28.539 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:29.147 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:27:29.405 [2024-07-12 18:31:12.895223] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:29.405 [2024-07-12 18:31:12.895367] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dac210 00:27:29.405 [2024-07-12 18:31:12.895380] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:29.405 [2024-07-12 18:31:12.895442] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dabc50 00:27:29.405 [2024-07-12 18:31:12.895540] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dac210 00:27:29.405 [2024-07-12 18:31:12.895551] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid 
bdev is created with name Existed_Raid, raid_bdev 0x1dac210 00:27:29.405 [2024-07-12 18:31:12.895614] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:29.405 BaseBdev2 00:27:29.405 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:29.405 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:29.405 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:29.405 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:29.405 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:29.405 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:29.405 18:31:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:29.663 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:29.663 [ 00:27:29.664 { 00:27:29.664 "name": "BaseBdev2", 00:27:29.664 "aliases": [ 00:27:29.664 "7e8a9da9-91fa-4290-8ebe-29530b47af5c" 00:27:29.664 ], 00:27:29.664 "product_name": "Malloc disk", 00:27:29.664 "block_size": 4096, 00:27:29.664 "num_blocks": 8192, 00:27:29.664 "uuid": "7e8a9da9-91fa-4290-8ebe-29530b47af5c", 00:27:29.664 "md_size": 32, 00:27:29.664 "md_interleave": false, 00:27:29.664 "dif_type": 0, 00:27:29.664 "assigned_rate_limits": { 00:27:29.664 "rw_ios_per_sec": 0, 00:27:29.664 "rw_mbytes_per_sec": 0, 00:27:29.664 "r_mbytes_per_sec": 0, 00:27:29.664 
"w_mbytes_per_sec": 0 00:27:29.664 }, 00:27:29.664 "claimed": true, 00:27:29.664 "claim_type": "exclusive_write", 00:27:29.664 "zoned": false, 00:27:29.664 "supported_io_types": { 00:27:29.664 "read": true, 00:27:29.664 "write": true, 00:27:29.664 "unmap": true, 00:27:29.664 "flush": true, 00:27:29.664 "reset": true, 00:27:29.664 "nvme_admin": false, 00:27:29.664 "nvme_io": false, 00:27:29.664 "nvme_io_md": false, 00:27:29.664 "write_zeroes": true, 00:27:29.664 "zcopy": true, 00:27:29.664 "get_zone_info": false, 00:27:29.664 "zone_management": false, 00:27:29.664 "zone_append": false, 00:27:29.664 "compare": false, 00:27:29.664 "compare_and_write": false, 00:27:29.664 "abort": true, 00:27:29.664 "seek_hole": false, 00:27:29.664 "seek_data": false, 00:27:29.664 "copy": true, 00:27:29.664 "nvme_iov_md": false 00:27:29.664 }, 00:27:29.664 "memory_domains": [ 00:27:29.664 { 00:27:29.664 "dma_device_id": "system", 00:27:29.664 "dma_device_type": 1 00:27:29.664 }, 00:27:29.664 { 00:27:29.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:29.664 "dma_device_type": 2 00:27:29.664 } 00:27:29.664 ], 00:27:29.664 "driver_specific": {} 00:27:29.664 } 00:27:29.664 ] 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:29.922 18:31:13 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:29.922 "name": "Existed_Raid", 00:27:29.922 "uuid": "7857e012-85b4-42ed-a158-88b7df1add2d", 00:27:29.922 "strip_size_kb": 0, 00:27:29.922 "state": "online", 00:27:29.922 "raid_level": "raid1", 00:27:29.922 "superblock": true, 00:27:29.922 "num_base_bdevs": 2, 00:27:29.922 "num_base_bdevs_discovered": 2, 00:27:29.922 "num_base_bdevs_operational": 2, 00:27:29.922 "base_bdevs_list": [ 00:27:29.922 { 00:27:29.922 "name": "BaseBdev1", 00:27:29.922 "uuid": "df50682c-774b-48bf-b4dd-61f53de37d0c", 00:27:29.922 "is_configured": true, 00:27:29.922 "data_offset": 256, 00:27:29.922 "data_size": 7936 00:27:29.922 }, 00:27:29.922 { 00:27:29.922 "name": 
"BaseBdev2", 00:27:29.922 "uuid": "7e8a9da9-91fa-4290-8ebe-29530b47af5c", 00:27:29.922 "is_configured": true, 00:27:29.922 "data_offset": 256, 00:27:29.922 "data_size": 7936 00:27:29.922 } 00:27:29.922 ] 00:27:29.922 }' 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:29.922 18:31:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:30.489 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:30.489 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:30.489 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:30.489 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:30.489 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:30.489 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:30.489 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:30.489 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:30.747 [2024-07-12 18:31:14.423563] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:30.747 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:30.747 "name": "Existed_Raid", 00:27:30.747 "aliases": [ 00:27:30.747 "7857e012-85b4-42ed-a158-88b7df1add2d" 00:27:30.747 ], 00:27:30.747 "product_name": "Raid Volume", 00:27:30.747 "block_size": 4096, 
00:27:30.747 "num_blocks": 7936, 00:27:30.747 "uuid": "7857e012-85b4-42ed-a158-88b7df1add2d", 00:27:30.747 "md_size": 32, 00:27:30.747 "md_interleave": false, 00:27:30.747 "dif_type": 0, 00:27:30.747 "assigned_rate_limits": { 00:27:30.747 "rw_ios_per_sec": 0, 00:27:30.747 "rw_mbytes_per_sec": 0, 00:27:30.747 "r_mbytes_per_sec": 0, 00:27:30.747 "w_mbytes_per_sec": 0 00:27:30.747 }, 00:27:30.747 "claimed": false, 00:27:30.747 "zoned": false, 00:27:30.747 "supported_io_types": { 00:27:30.747 "read": true, 00:27:30.747 "write": true, 00:27:30.747 "unmap": false, 00:27:30.747 "flush": false, 00:27:30.747 "reset": true, 00:27:30.747 "nvme_admin": false, 00:27:30.747 "nvme_io": false, 00:27:30.747 "nvme_io_md": false, 00:27:30.747 "write_zeroes": true, 00:27:30.747 "zcopy": false, 00:27:30.747 "get_zone_info": false, 00:27:30.747 "zone_management": false, 00:27:30.747 "zone_append": false, 00:27:30.747 "compare": false, 00:27:30.747 "compare_and_write": false, 00:27:30.747 "abort": false, 00:27:30.747 "seek_hole": false, 00:27:30.747 "seek_data": false, 00:27:30.747 "copy": false, 00:27:30.747 "nvme_iov_md": false 00:27:30.747 }, 00:27:30.747 "memory_domains": [ 00:27:30.747 { 00:27:30.747 "dma_device_id": "system", 00:27:30.747 "dma_device_type": 1 00:27:30.747 }, 00:27:30.747 { 00:27:30.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:30.747 "dma_device_type": 2 00:27:30.747 }, 00:27:30.747 { 00:27:30.747 "dma_device_id": "system", 00:27:30.747 "dma_device_type": 1 00:27:30.747 }, 00:27:30.747 { 00:27:30.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:30.747 "dma_device_type": 2 00:27:30.747 } 00:27:30.747 ], 00:27:30.747 "driver_specific": { 00:27:30.747 "raid": { 00:27:30.747 "uuid": "7857e012-85b4-42ed-a158-88b7df1add2d", 00:27:30.747 "strip_size_kb": 0, 00:27:30.747 "state": "online", 00:27:30.747 "raid_level": "raid1", 00:27:30.747 "superblock": true, 00:27:30.747 "num_base_bdevs": 2, 00:27:30.747 "num_base_bdevs_discovered": 2, 00:27:30.747 
"num_base_bdevs_operational": 2, 00:27:30.747 "base_bdevs_list": [ 00:27:30.747 { 00:27:30.747 "name": "BaseBdev1", 00:27:30.747 "uuid": "df50682c-774b-48bf-b4dd-61f53de37d0c", 00:27:30.747 "is_configured": true, 00:27:30.747 "data_offset": 256, 00:27:30.747 "data_size": 7936 00:27:30.747 }, 00:27:30.747 { 00:27:30.747 "name": "BaseBdev2", 00:27:30.747 "uuid": "7e8a9da9-91fa-4290-8ebe-29530b47af5c", 00:27:30.747 "is_configured": true, 00:27:30.747 "data_offset": 256, 00:27:30.747 "data_size": 7936 00:27:30.747 } 00:27:30.747 ] 00:27:30.747 } 00:27:30.747 } 00:27:30.747 }' 00:27:30.747 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:31.009 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:31.009 BaseBdev2' 00:27:31.009 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:31.009 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:31.009 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:31.009 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:31.009 "name": "BaseBdev1", 00:27:31.009 "aliases": [ 00:27:31.009 "df50682c-774b-48bf-b4dd-61f53de37d0c" 00:27:31.009 ], 00:27:31.009 "product_name": "Malloc disk", 00:27:31.009 "block_size": 4096, 00:27:31.009 "num_blocks": 8192, 00:27:31.009 "uuid": "df50682c-774b-48bf-b4dd-61f53de37d0c", 00:27:31.009 "md_size": 32, 00:27:31.009 "md_interleave": false, 00:27:31.009 "dif_type": 0, 00:27:31.009 "assigned_rate_limits": { 00:27:31.009 "rw_ios_per_sec": 0, 00:27:31.009 
"rw_mbytes_per_sec": 0, 00:27:31.009 "r_mbytes_per_sec": 0, 00:27:31.009 "w_mbytes_per_sec": 0 00:27:31.009 }, 00:27:31.009 "claimed": true, 00:27:31.009 "claim_type": "exclusive_write", 00:27:31.009 "zoned": false, 00:27:31.010 "supported_io_types": { 00:27:31.010 "read": true, 00:27:31.010 "write": true, 00:27:31.010 "unmap": true, 00:27:31.010 "flush": true, 00:27:31.010 "reset": true, 00:27:31.010 "nvme_admin": false, 00:27:31.010 "nvme_io": false, 00:27:31.010 "nvme_io_md": false, 00:27:31.010 "write_zeroes": true, 00:27:31.010 "zcopy": true, 00:27:31.010 "get_zone_info": false, 00:27:31.010 "zone_management": false, 00:27:31.010 "zone_append": false, 00:27:31.010 "compare": false, 00:27:31.010 "compare_and_write": false, 00:27:31.010 "abort": true, 00:27:31.010 "seek_hole": false, 00:27:31.010 "seek_data": false, 00:27:31.010 "copy": true, 00:27:31.010 "nvme_iov_md": false 00:27:31.010 }, 00:27:31.010 "memory_domains": [ 00:27:31.010 { 00:27:31.010 "dma_device_id": "system", 00:27:31.010 "dma_device_type": 1 00:27:31.010 }, 00:27:31.010 { 00:27:31.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:31.010 "dma_device_type": 2 00:27:31.010 } 00:27:31.010 ], 00:27:31.010 "driver_specific": {} 00:27:31.010 }' 00:27:31.268 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:31.268 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:31.268 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:31.268 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:31.268 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:31.268 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:31.268 18:31:14 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:31.268 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:31.268 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:31.268 18:31:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:31.525 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:31.525 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:31.525 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:31.525 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:31.525 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:31.782 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:31.782 "name": "BaseBdev2", 00:27:31.782 "aliases": [ 00:27:31.782 "7e8a9da9-91fa-4290-8ebe-29530b47af5c" 00:27:31.782 ], 00:27:31.782 "product_name": "Malloc disk", 00:27:31.782 "block_size": 4096, 00:27:31.782 "num_blocks": 8192, 00:27:31.782 "uuid": "7e8a9da9-91fa-4290-8ebe-29530b47af5c", 00:27:31.782 "md_size": 32, 00:27:31.782 "md_interleave": false, 00:27:31.782 "dif_type": 0, 00:27:31.782 "assigned_rate_limits": { 00:27:31.782 "rw_ios_per_sec": 0, 00:27:31.782 "rw_mbytes_per_sec": 0, 00:27:31.782 "r_mbytes_per_sec": 0, 00:27:31.782 "w_mbytes_per_sec": 0 00:27:31.782 }, 00:27:31.782 "claimed": true, 00:27:31.782 "claim_type": "exclusive_write", 00:27:31.782 "zoned": false, 00:27:31.782 "supported_io_types": { 
00:27:31.782 "read": true, 00:27:31.782 "write": true, 00:27:31.782 "unmap": true, 00:27:31.782 "flush": true, 00:27:31.782 "reset": true, 00:27:31.782 "nvme_admin": false, 00:27:31.782 "nvme_io": false, 00:27:31.782 "nvme_io_md": false, 00:27:31.782 "write_zeroes": true, 00:27:31.782 "zcopy": true, 00:27:31.782 "get_zone_info": false, 00:27:31.782 "zone_management": false, 00:27:31.782 "zone_append": false, 00:27:31.782 "compare": false, 00:27:31.782 "compare_and_write": false, 00:27:31.782 "abort": true, 00:27:31.782 "seek_hole": false, 00:27:31.782 "seek_data": false, 00:27:31.782 "copy": true, 00:27:31.782 "nvme_iov_md": false 00:27:31.782 }, 00:27:31.782 "memory_domains": [ 00:27:31.782 { 00:27:31.782 "dma_device_id": "system", 00:27:31.782 "dma_device_type": 1 00:27:31.782 }, 00:27:31.782 { 00:27:31.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:31.782 "dma_device_type": 2 00:27:31.782 } 00:27:31.782 ], 00:27:31.782 "driver_specific": {} 00:27:31.782 }' 00:27:31.782 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:31.782 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:31.782 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:31.782 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:31.782 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:32.038 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:32.038 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:32.038 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:32.038 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:32.038 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:32.038 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:32.038 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:32.038 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:32.295 [2024-07-12 18:31:15.919316] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:32.295 
18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.295 18:31:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:32.553 18:31:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.553 "name": "Existed_Raid", 00:27:32.553 "uuid": "7857e012-85b4-42ed-a158-88b7df1add2d", 00:27:32.553 "strip_size_kb": 0, 00:27:32.553 "state": "online", 00:27:32.553 "raid_level": "raid1", 00:27:32.553 "superblock": true, 00:27:32.553 "num_base_bdevs": 2, 00:27:32.553 "num_base_bdevs_discovered": 1, 00:27:32.553 "num_base_bdevs_operational": 1, 00:27:32.553 "base_bdevs_list": [ 00:27:32.553 { 00:27:32.553 "name": null, 00:27:32.553 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.553 "is_configured": false, 00:27:32.553 "data_offset": 256, 00:27:32.553 "data_size": 7936 00:27:32.553 }, 00:27:32.553 { 00:27:32.553 "name": "BaseBdev2", 00:27:32.553 "uuid": "7e8a9da9-91fa-4290-8ebe-29530b47af5c", 00:27:32.553 "is_configured": true, 00:27:32.553 "data_offset": 256, 00:27:32.553 "data_size": 7936 00:27:32.553 } 00:27:32.553 ] 00:27:32.553 }' 00:27:32.553 18:31:16 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.553 18:31:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:33.115 18:31:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:33.115 18:31:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:33.115 18:31:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.115 18:31:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:33.373 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:33.373 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:33.373 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:33.629 [2024-07-12 18:31:17.281588] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:33.629 [2024-07-12 18:31:17.281673] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:33.629 [2024-07-12 18:31:17.293125] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:33.629 [2024-07-12 18:31:17.293161] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:33.629 [2024-07-12 18:31:17.293177] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dac210 name Existed_Raid, state offline 00:27:33.629 18:31:17 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:33.629 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:33.629 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:33.629 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 2606181 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2606181 ']' 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2606181 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2606181 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:33.886 18:31:17 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2606181' 00:27:33.886 killing process with pid 2606181 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2606181 00:27:33.886 [2024-07-12 18:31:17.594298] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:33.886 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2606181 00:27:33.886 [2024-07-12 18:31:17.595175] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:34.143 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:27:34.143 00:27:34.143 real 0m10.568s 00:27:34.143 user 0m18.808s 00:27:34.143 sys 0m1.971s 00:27:34.143 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:34.143 18:31:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:34.143 ************************************ 00:27:34.143 END TEST raid_state_function_test_sb_md_separate 00:27:34.143 ************************************ 00:27:34.143 18:31:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:34.143 18:31:17 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:27:34.143 18:31:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:34.143 18:31:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:34.143 18:31:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:34.402 ************************************ 00:27:34.402 START TEST raid_superblock_test_md_separate 00:27:34.402 ************************************ 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 
00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=2607806 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 2607806 /var/tmp/spdk-raid.sock 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2607806 ']' 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:34.402 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:34.402 18:31:17 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:34.402 [2024-07-12 18:31:17.934955] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:27:34.402 [2024-07-12 18:31:17.935017] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607806 ] 00:27:34.402 [2024-07-12 18:31:18.062017] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:34.660 [2024-07-12 18:31:18.167199] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:34.660 [2024-07-12 18:31:18.236062] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:34.660 [2024-07-12 18:31:18.236099] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:35.226 18:31:18 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:35.227 18:31:18 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:35.227 18:31:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:35.227 18:31:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:35.227 18:31:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:35.227 18:31:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:35.227 18:31:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:35.227 18:31:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:35.227 18:31:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:35.227 18:31:18 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:35.227 18:31:18 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:27:35.485 malloc1 00:27:35.485 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:35.744 [2024-07-12 18:31:19.276181] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:35.744 [2024-07-12 18:31:19.276227] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:35.744 [2024-07-12 18:31:19.276247] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238b830 00:27:35.744 [2024-07-12 18:31:19.276260] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:35.744 [2024-07-12 18:31:19.277759] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:35.744 [2024-07-12 18:31:19.277789] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:35.744 pt1 00:27:35.744 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:35.744 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:35.744 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:35.744 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:35.744 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:35.744 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:35.744 18:31:19 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:35.744 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:35.744 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:27:35.744 malloc2 00:27:36.003 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:36.003 [2024-07-12 18:31:19.698790] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:36.003 [2024-07-12 18:31:19.698836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:36.003 [2024-07-12 18:31:19.698855] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x237d250 00:27:36.003 [2024-07-12 18:31:19.698868] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:36.003 [2024-07-12 18:31:19.700272] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:36.003 [2024-07-12 18:31:19.700301] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:36.003 pt2 00:27:36.003 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:36.003 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:36.003 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:36.262 [2024-07-12 18:31:19.931418] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:36.262 [2024-07-12 18:31:19.932774] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:36.262 [2024-07-12 18:31:19.932910] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x237dd20 00:27:36.262 [2024-07-12 18:31:19.932924] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:36.262 [2024-07-12 18:31:19.933006] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2371a60 00:27:36.262 [2024-07-12 18:31:19.933118] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x237dd20 00:27:36.262 [2024-07-12 18:31:19.933128] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x237dd20 00:27:36.262 [2024-07-12 18:31:19.933195] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:36.262 18:31:19 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:36.262 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.263 18:31:19 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.521 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:36.521 "name": "raid_bdev1", 00:27:36.521 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:36.521 "strip_size_kb": 0, 00:27:36.521 "state": "online", 00:27:36.521 "raid_level": "raid1", 00:27:36.521 "superblock": true, 00:27:36.521 "num_base_bdevs": 2, 00:27:36.521 "num_base_bdevs_discovered": 2, 00:27:36.521 "num_base_bdevs_operational": 2, 00:27:36.521 "base_bdevs_list": [ 00:27:36.521 { 00:27:36.521 "name": "pt1", 00:27:36.521 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:36.521 "is_configured": true, 00:27:36.521 "data_offset": 256, 00:27:36.521 "data_size": 7936 00:27:36.521 }, 00:27:36.521 { 00:27:36.521 "name": "pt2", 00:27:36.521 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:36.521 "is_configured": true, 00:27:36.521 "data_offset": 256, 00:27:36.521 "data_size": 7936 00:27:36.521 } 00:27:36.521 ] 00:27:36.521 }' 00:27:36.521 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:36.521 18:31:20 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:37.087 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:37.087 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:27:37.087 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:37.087 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:37.087 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:37.087 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:37.087 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:37.088 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:37.347 [2024-07-12 18:31:20.966391] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:37.347 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:37.347 "name": "raid_bdev1", 00:27:37.347 "aliases": [ 00:27:37.347 "47aff921-367a-44ac-b888-7f66c8aa78d2" 00:27:37.347 ], 00:27:37.347 "product_name": "Raid Volume", 00:27:37.347 "block_size": 4096, 00:27:37.347 "num_blocks": 7936, 00:27:37.347 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:37.347 "md_size": 32, 00:27:37.347 "md_interleave": false, 00:27:37.347 "dif_type": 0, 00:27:37.347 "assigned_rate_limits": { 00:27:37.347 "rw_ios_per_sec": 0, 00:27:37.347 "rw_mbytes_per_sec": 0, 00:27:37.347 "r_mbytes_per_sec": 0, 00:27:37.347 "w_mbytes_per_sec": 0 00:27:37.347 }, 00:27:37.347 "claimed": false, 00:27:37.347 "zoned": false, 00:27:37.347 "supported_io_types": { 00:27:37.347 "read": true, 00:27:37.347 "write": true, 00:27:37.347 "unmap": false, 00:27:37.347 "flush": false, 00:27:37.347 "reset": true, 00:27:37.347 "nvme_admin": false, 00:27:37.347 "nvme_io": false, 00:27:37.347 "nvme_io_md": false, 00:27:37.347 "write_zeroes": true, 
00:27:37.347 "zcopy": false, 00:27:37.347 "get_zone_info": false, 00:27:37.347 "zone_management": false, 00:27:37.347 "zone_append": false, 00:27:37.347 "compare": false, 00:27:37.347 "compare_and_write": false, 00:27:37.347 "abort": false, 00:27:37.347 "seek_hole": false, 00:27:37.347 "seek_data": false, 00:27:37.347 "copy": false, 00:27:37.347 "nvme_iov_md": false 00:27:37.347 }, 00:27:37.347 "memory_domains": [ 00:27:37.347 { 00:27:37.347 "dma_device_id": "system", 00:27:37.347 "dma_device_type": 1 00:27:37.347 }, 00:27:37.347 { 00:27:37.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:37.347 "dma_device_type": 2 00:27:37.347 }, 00:27:37.347 { 00:27:37.347 "dma_device_id": "system", 00:27:37.347 "dma_device_type": 1 00:27:37.347 }, 00:27:37.347 { 00:27:37.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:37.347 "dma_device_type": 2 00:27:37.347 } 00:27:37.347 ], 00:27:37.347 "driver_specific": { 00:27:37.347 "raid": { 00:27:37.347 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:37.347 "strip_size_kb": 0, 00:27:37.347 "state": "online", 00:27:37.347 "raid_level": "raid1", 00:27:37.347 "superblock": true, 00:27:37.347 "num_base_bdevs": 2, 00:27:37.347 "num_base_bdevs_discovered": 2, 00:27:37.347 "num_base_bdevs_operational": 2, 00:27:37.347 "base_bdevs_list": [ 00:27:37.347 { 00:27:37.347 "name": "pt1", 00:27:37.347 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:37.347 "is_configured": true, 00:27:37.347 "data_offset": 256, 00:27:37.347 "data_size": 7936 00:27:37.347 }, 00:27:37.347 { 00:27:37.347 "name": "pt2", 00:27:37.347 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:37.347 "is_configured": true, 00:27:37.347 "data_offset": 256, 00:27:37.347 "data_size": 7936 00:27:37.347 } 00:27:37.347 ] 00:27:37.347 } 00:27:37.347 } 00:27:37.347 }' 00:27:37.347 18:31:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:37.347 18:31:21 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:37.347 pt2' 00:27:37.347 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:37.347 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:37.347 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:37.607 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:37.607 "name": "pt1", 00:27:37.607 "aliases": [ 00:27:37.607 "00000000-0000-0000-0000-000000000001" 00:27:37.607 ], 00:27:37.607 "product_name": "passthru", 00:27:37.607 "block_size": 4096, 00:27:37.607 "num_blocks": 8192, 00:27:37.607 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:37.607 "md_size": 32, 00:27:37.607 "md_interleave": false, 00:27:37.607 "dif_type": 0, 00:27:37.607 "assigned_rate_limits": { 00:27:37.607 "rw_ios_per_sec": 0, 00:27:37.607 "rw_mbytes_per_sec": 0, 00:27:37.607 "r_mbytes_per_sec": 0, 00:27:37.607 "w_mbytes_per_sec": 0 00:27:37.607 }, 00:27:37.607 "claimed": true, 00:27:37.607 "claim_type": "exclusive_write", 00:27:37.607 "zoned": false, 00:27:37.607 "supported_io_types": { 00:27:37.607 "read": true, 00:27:37.607 "write": true, 00:27:37.607 "unmap": true, 00:27:37.607 "flush": true, 00:27:37.607 "reset": true, 00:27:37.607 "nvme_admin": false, 00:27:37.607 "nvme_io": false, 00:27:37.607 "nvme_io_md": false, 00:27:37.607 "write_zeroes": true, 00:27:37.607 "zcopy": true, 00:27:37.607 "get_zone_info": false, 00:27:37.607 "zone_management": false, 00:27:37.607 "zone_append": false, 00:27:37.607 "compare": false, 00:27:37.607 "compare_and_write": false, 00:27:37.607 "abort": true, 00:27:37.607 "seek_hole": false, 00:27:37.607 "seek_data": false, 00:27:37.607 "copy": true, 00:27:37.607 
"nvme_iov_md": false 00:27:37.607 }, 00:27:37.607 "memory_domains": [ 00:27:37.607 { 00:27:37.607 "dma_device_id": "system", 00:27:37.607 "dma_device_type": 1 00:27:37.607 }, 00:27:37.607 { 00:27:37.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:37.607 "dma_device_type": 2 00:27:37.607 } 00:27:37.607 ], 00:27:37.607 "driver_specific": { 00:27:37.607 "passthru": { 00:27:37.607 "name": "pt1", 00:27:37.607 "base_bdev_name": "malloc1" 00:27:37.607 } 00:27:37.607 } 00:27:37.607 }' 00:27:37.607 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:37.865 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:37.865 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:37.865 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:37.866 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:37.866 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:37.866 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:37.866 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:37.866 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:37.866 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:37.866 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:38.125 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:38.125 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:38.125 18:31:21 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:38.125 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:38.384 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:38.384 "name": "pt2", 00:27:38.384 "aliases": [ 00:27:38.384 "00000000-0000-0000-0000-000000000002" 00:27:38.384 ], 00:27:38.384 "product_name": "passthru", 00:27:38.384 "block_size": 4096, 00:27:38.384 "num_blocks": 8192, 00:27:38.384 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:38.384 "md_size": 32, 00:27:38.384 "md_interleave": false, 00:27:38.384 "dif_type": 0, 00:27:38.384 "assigned_rate_limits": { 00:27:38.384 "rw_ios_per_sec": 0, 00:27:38.384 "rw_mbytes_per_sec": 0, 00:27:38.384 "r_mbytes_per_sec": 0, 00:27:38.384 "w_mbytes_per_sec": 0 00:27:38.384 }, 00:27:38.384 "claimed": true, 00:27:38.384 "claim_type": "exclusive_write", 00:27:38.384 "zoned": false, 00:27:38.384 "supported_io_types": { 00:27:38.384 "read": true, 00:27:38.384 "write": true, 00:27:38.384 "unmap": true, 00:27:38.384 "flush": true, 00:27:38.384 "reset": true, 00:27:38.384 "nvme_admin": false, 00:27:38.384 "nvme_io": false, 00:27:38.384 "nvme_io_md": false, 00:27:38.384 "write_zeroes": true, 00:27:38.384 "zcopy": true, 00:27:38.384 "get_zone_info": false, 00:27:38.384 "zone_management": false, 00:27:38.384 "zone_append": false, 00:27:38.384 "compare": false, 00:27:38.384 "compare_and_write": false, 00:27:38.384 "abort": true, 00:27:38.384 "seek_hole": false, 00:27:38.384 "seek_data": false, 00:27:38.384 "copy": true, 00:27:38.384 "nvme_iov_md": false 00:27:38.384 }, 00:27:38.384 "memory_domains": [ 00:27:38.384 { 00:27:38.384 "dma_device_id": "system", 00:27:38.384 "dma_device_type": 1 00:27:38.384 }, 00:27:38.384 { 00:27:38.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:38.384 "dma_device_type": 2 00:27:38.384 } 
00:27:38.384 ], 00:27:38.384 "driver_specific": { 00:27:38.384 "passthru": { 00:27:38.384 "name": "pt2", 00:27:38.384 "base_bdev_name": "malloc2" 00:27:38.384 } 00:27:38.384 } 00:27:38.384 }' 00:27:38.384 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:38.384 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:38.384 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:38.384 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:38.384 18:31:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:38.384 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:38.384 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:38.384 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:38.384 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:38.384 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:38.642 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:38.643 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:38.643 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:38.643 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:38.900 [2024-07-12 18:31:22.430263] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:38.900 18:31:22 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=47aff921-367a-44ac-b888-7f66c8aa78d2 00:27:38.900 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 47aff921-367a-44ac-b888-7f66c8aa78d2 ']' 00:27:38.900 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:39.466 [2024-07-12 18:31:22.931351] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:39.466 [2024-07-12 18:31:22.931372] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:39.466 [2024-07-12 18:31:22.931423] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:39.466 [2024-07-12 18:31:22.931473] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:39.466 [2024-07-12 18:31:22.931485] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x237dd20 name raid_bdev1, state offline 00:27:39.466 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.466 18:31:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:39.724 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:39.724 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:39.724 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:39.724 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:27:39.724 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:39.982 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:40.241 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:40.241 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:40.499 18:31:23 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:40.499 18:31:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:40.499 [2024-07-12 18:31:24.202649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:40.499 [2024-07-12 18:31:24.204066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:40.499 [2024-07-12 18:31:24.204119] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:40.499 [2024-07-12 18:31:24.204159] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:40.499 [2024-07-12 18:31:24.204177] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:40.499 [2024-07-12 18:31:24.204186] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21eded0 name raid_bdev1, state configuring 00:27:40.499 request: 00:27:40.499 { 00:27:40.499 "name": "raid_bdev1", 00:27:40.499 "raid_level": "raid1", 00:27:40.499 "base_bdevs": [ 
00:27:40.500 "malloc1", 00:27:40.500 "malloc2" 00:27:40.500 ], 00:27:40.500 "superblock": false, 00:27:40.500 "method": "bdev_raid_create", 00:27:40.500 "req_id": 1 00:27:40.500 } 00:27:40.500 Got JSON-RPC error response 00:27:40.500 response: 00:27:40.500 { 00:27:40.500 "code": -17, 00:27:40.500 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:40.500 } 00:27:40.500 18:31:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:27:40.500 18:31:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:40.500 18:31:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:40.500 18:31:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:40.500 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.759 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:40.759 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:40.759 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:40.759 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:41.326 [2024-07-12 18:31:24.868347] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:41.326 [2024-07-12 18:31:24.868387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:41.326 [2024-07-12 18:31:24.868404] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238bee0 
00:27:41.326 [2024-07-12 18:31:24.868417] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:41.326 [2024-07-12 18:31:24.869828] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:41.326 [2024-07-12 18:31:24.869856] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:41.326 [2024-07-12 18:31:24.869898] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:41.326 [2024-07-12 18:31:24.869922] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:41.326 pt1 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.326 18:31:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.584 18:31:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:41.584 "name": "raid_bdev1", 00:27:41.584 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:41.584 "strip_size_kb": 0, 00:27:41.584 "state": "configuring", 00:27:41.584 "raid_level": "raid1", 00:27:41.584 "superblock": true, 00:27:41.584 "num_base_bdevs": 2, 00:27:41.584 "num_base_bdevs_discovered": 1, 00:27:41.584 "num_base_bdevs_operational": 2, 00:27:41.584 "base_bdevs_list": [ 00:27:41.584 { 00:27:41.584 "name": "pt1", 00:27:41.584 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:41.584 "is_configured": true, 00:27:41.584 "data_offset": 256, 00:27:41.584 "data_size": 7936 00:27:41.584 }, 00:27:41.584 { 00:27:41.584 "name": null, 00:27:41.584 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:41.584 "is_configured": false, 00:27:41.584 "data_offset": 256, 00:27:41.584 "data_size": 7936 00:27:41.584 } 00:27:41.584 ] 00:27:41.584 }' 00:27:41.584 18:31:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:41.584 18:31:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:42.150 18:31:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:42.150 18:31:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:42.150 18:31:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:42.150 18:31:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:42.408 [2024-07-12 18:31:25.983325] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:42.408 [2024-07-12 18:31:25.983370] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:42.408 [2024-07-12 18:31:25.983387] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21ee490 00:27:42.408 [2024-07-12 18:31:25.983399] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:42.408 [2024-07-12 18:31:25.983576] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:42.408 [2024-07-12 18:31:25.983593] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:42.408 [2024-07-12 18:31:25.983636] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:42.408 [2024-07-12 18:31:25.983654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:42.409 [2024-07-12 18:31:25.983741] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23725d0 00:27:42.409 [2024-07-12 18:31:25.983752] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:42.409 [2024-07-12 18:31:25.983805] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2373800 00:27:42.409 [2024-07-12 18:31:25.983903] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23725d0 00:27:42.409 [2024-07-12 18:31:25.983919] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23725d0 00:27:42.409 [2024-07-12 18:31:25.983999] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:42.409 pt2 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < 
num_base_bdevs )) 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.409 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.666 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:42.666 "name": "raid_bdev1", 00:27:42.666 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:42.666 "strip_size_kb": 0, 00:27:42.666 "state": "online", 00:27:42.666 "raid_level": "raid1", 00:27:42.666 "superblock": true, 00:27:42.666 "num_base_bdevs": 2, 00:27:42.666 
"num_base_bdevs_discovered": 2, 00:27:42.666 "num_base_bdevs_operational": 2, 00:27:42.666 "base_bdevs_list": [ 00:27:42.666 { 00:27:42.666 "name": "pt1", 00:27:42.666 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:42.666 "is_configured": true, 00:27:42.666 "data_offset": 256, 00:27:42.666 "data_size": 7936 00:27:42.666 }, 00:27:42.666 { 00:27:42.666 "name": "pt2", 00:27:42.666 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:42.666 "is_configured": true, 00:27:42.666 "data_offset": 256, 00:27:42.666 "data_size": 7936 00:27:42.666 } 00:27:42.666 ] 00:27:42.666 }' 00:27:42.666 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:42.666 18:31:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:43.232 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:43.232 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:43.232 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:43.232 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:43.232 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:43.232 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:43.232 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:43.232 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:43.490 [2024-07-12 18:31:26.970193] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:43.490 18:31:26 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:43.490 "name": "raid_bdev1", 00:27:43.490 "aliases": [ 00:27:43.490 "47aff921-367a-44ac-b888-7f66c8aa78d2" 00:27:43.490 ], 00:27:43.490 "product_name": "Raid Volume", 00:27:43.490 "block_size": 4096, 00:27:43.490 "num_blocks": 7936, 00:27:43.490 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:43.490 "md_size": 32, 00:27:43.490 "md_interleave": false, 00:27:43.490 "dif_type": 0, 00:27:43.490 "assigned_rate_limits": { 00:27:43.490 "rw_ios_per_sec": 0, 00:27:43.490 "rw_mbytes_per_sec": 0, 00:27:43.490 "r_mbytes_per_sec": 0, 00:27:43.490 "w_mbytes_per_sec": 0 00:27:43.490 }, 00:27:43.490 "claimed": false, 00:27:43.490 "zoned": false, 00:27:43.490 "supported_io_types": { 00:27:43.490 "read": true, 00:27:43.490 "write": true, 00:27:43.490 "unmap": false, 00:27:43.490 "flush": false, 00:27:43.490 "reset": true, 00:27:43.490 "nvme_admin": false, 00:27:43.490 "nvme_io": false, 00:27:43.490 "nvme_io_md": false, 00:27:43.490 "write_zeroes": true, 00:27:43.490 "zcopy": false, 00:27:43.490 "get_zone_info": false, 00:27:43.490 "zone_management": false, 00:27:43.490 "zone_append": false, 00:27:43.490 "compare": false, 00:27:43.490 "compare_and_write": false, 00:27:43.490 "abort": false, 00:27:43.490 "seek_hole": false, 00:27:43.490 "seek_data": false, 00:27:43.490 "copy": false, 00:27:43.490 "nvme_iov_md": false 00:27:43.490 }, 00:27:43.490 "memory_domains": [ 00:27:43.490 { 00:27:43.490 "dma_device_id": "system", 00:27:43.490 "dma_device_type": 1 00:27:43.490 }, 00:27:43.490 { 00:27:43.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:43.490 "dma_device_type": 2 00:27:43.490 }, 00:27:43.490 { 00:27:43.490 "dma_device_id": "system", 00:27:43.490 "dma_device_type": 1 00:27:43.490 }, 00:27:43.490 { 00:27:43.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:43.490 "dma_device_type": 2 00:27:43.490 } 00:27:43.490 ], 00:27:43.490 "driver_specific": { 00:27:43.490 "raid": { 
00:27:43.490 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:43.490 "strip_size_kb": 0, 00:27:43.490 "state": "online", 00:27:43.490 "raid_level": "raid1", 00:27:43.490 "superblock": true, 00:27:43.490 "num_base_bdevs": 2, 00:27:43.490 "num_base_bdevs_discovered": 2, 00:27:43.490 "num_base_bdevs_operational": 2, 00:27:43.490 "base_bdevs_list": [ 00:27:43.490 { 00:27:43.490 "name": "pt1", 00:27:43.490 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:43.490 "is_configured": true, 00:27:43.490 "data_offset": 256, 00:27:43.490 "data_size": 7936 00:27:43.490 }, 00:27:43.490 { 00:27:43.490 "name": "pt2", 00:27:43.490 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:43.490 "is_configured": true, 00:27:43.490 "data_offset": 256, 00:27:43.490 "data_size": 7936 00:27:43.490 } 00:27:43.490 ] 00:27:43.490 } 00:27:43.490 } 00:27:43.490 }' 00:27:43.490 18:31:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:43.490 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:43.490 pt2' 00:27:43.490 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:43.490 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:43.490 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:43.748 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:43.748 "name": "pt1", 00:27:43.748 "aliases": [ 00:27:43.748 "00000000-0000-0000-0000-000000000001" 00:27:43.748 ], 00:27:43.748 "product_name": "passthru", 00:27:43.748 "block_size": 4096, 00:27:43.748 "num_blocks": 8192, 00:27:43.748 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:27:43.748 "md_size": 32, 00:27:43.748 "md_interleave": false, 00:27:43.748 "dif_type": 0, 00:27:43.748 "assigned_rate_limits": { 00:27:43.748 "rw_ios_per_sec": 0, 00:27:43.748 "rw_mbytes_per_sec": 0, 00:27:43.748 "r_mbytes_per_sec": 0, 00:27:43.748 "w_mbytes_per_sec": 0 00:27:43.748 }, 00:27:43.748 "claimed": true, 00:27:43.748 "claim_type": "exclusive_write", 00:27:43.748 "zoned": false, 00:27:43.748 "supported_io_types": { 00:27:43.748 "read": true, 00:27:43.748 "write": true, 00:27:43.748 "unmap": true, 00:27:43.748 "flush": true, 00:27:43.748 "reset": true, 00:27:43.748 "nvme_admin": false, 00:27:43.748 "nvme_io": false, 00:27:43.748 "nvme_io_md": false, 00:27:43.748 "write_zeroes": true, 00:27:43.748 "zcopy": true, 00:27:43.748 "get_zone_info": false, 00:27:43.748 "zone_management": false, 00:27:43.748 "zone_append": false, 00:27:43.748 "compare": false, 00:27:43.748 "compare_and_write": false, 00:27:43.748 "abort": true, 00:27:43.748 "seek_hole": false, 00:27:43.748 "seek_data": false, 00:27:43.748 "copy": true, 00:27:43.748 "nvme_iov_md": false 00:27:43.748 }, 00:27:43.748 "memory_domains": [ 00:27:43.748 { 00:27:43.748 "dma_device_id": "system", 00:27:43.748 "dma_device_type": 1 00:27:43.748 }, 00:27:43.748 { 00:27:43.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:43.748 "dma_device_type": 2 00:27:43.748 } 00:27:43.748 ], 00:27:43.748 "driver_specific": { 00:27:43.748 "passthru": { 00:27:43.748 "name": "pt1", 00:27:43.748 "base_bdev_name": "malloc1" 00:27:43.748 } 00:27:43.748 } 00:27:43.748 }' 00:27:43.748 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:43.748 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:43.748 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:43.748 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:27:44.005 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:44.005 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:44.005 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:44.005 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:44.005 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:44.005 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:44.005 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:44.263 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:44.263 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:44.263 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:44.263 18:31:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:44.521 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:44.521 "name": "pt2", 00:27:44.521 "aliases": [ 00:27:44.521 "00000000-0000-0000-0000-000000000002" 00:27:44.521 ], 00:27:44.521 "product_name": "passthru", 00:27:44.521 "block_size": 4096, 00:27:44.521 "num_blocks": 8192, 00:27:44.521 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:44.521 "md_size": 32, 00:27:44.521 "md_interleave": false, 00:27:44.521 "dif_type": 0, 00:27:44.521 "assigned_rate_limits": { 00:27:44.521 "rw_ios_per_sec": 0, 00:27:44.521 "rw_mbytes_per_sec": 0, 00:27:44.521 "r_mbytes_per_sec": 0, 00:27:44.521 
"w_mbytes_per_sec": 0 00:27:44.521 }, 00:27:44.521 "claimed": true, 00:27:44.521 "claim_type": "exclusive_write", 00:27:44.521 "zoned": false, 00:27:44.521 "supported_io_types": { 00:27:44.521 "read": true, 00:27:44.521 "write": true, 00:27:44.521 "unmap": true, 00:27:44.521 "flush": true, 00:27:44.521 "reset": true, 00:27:44.521 "nvme_admin": false, 00:27:44.521 "nvme_io": false, 00:27:44.521 "nvme_io_md": false, 00:27:44.521 "write_zeroes": true, 00:27:44.521 "zcopy": true, 00:27:44.521 "get_zone_info": false, 00:27:44.521 "zone_management": false, 00:27:44.521 "zone_append": false, 00:27:44.521 "compare": false, 00:27:44.521 "compare_and_write": false, 00:27:44.521 "abort": true, 00:27:44.521 "seek_hole": false, 00:27:44.521 "seek_data": false, 00:27:44.521 "copy": true, 00:27:44.521 "nvme_iov_md": false 00:27:44.521 }, 00:27:44.521 "memory_domains": [ 00:27:44.521 { 00:27:44.521 "dma_device_id": "system", 00:27:44.521 "dma_device_type": 1 00:27:44.521 }, 00:27:44.521 { 00:27:44.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:44.521 "dma_device_type": 2 00:27:44.521 } 00:27:44.521 ], 00:27:44.521 "driver_specific": { 00:27:44.521 "passthru": { 00:27:44.521 "name": "pt2", 00:27:44.521 "base_bdev_name": "malloc2" 00:27:44.521 } 00:27:44.521 } 00:27:44.521 }' 00:27:44.521 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:44.521 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:44.521 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:44.521 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:44.521 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:44.521 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:44.521 18:31:28 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:44.521 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:44.780 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:44.780 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:44.780 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:44.780 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:44.780 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:44.780 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:45.038 [2024-07-12 18:31:28.594493] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:45.038 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 47aff921-367a-44ac-b888-7f66c8aa78d2 '!=' 47aff921-367a-44ac-b888-7f66c8aa78d2 ']' 00:27:45.038 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:45.038 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:45.038 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:45.038 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:45.319 [2024-07-12 18:31:28.830893] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.319 18:31:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.908 18:31:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.908 "name": "raid_bdev1", 00:27:45.908 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:45.908 "strip_size_kb": 0, 00:27:45.908 "state": "online", 00:27:45.908 "raid_level": "raid1", 00:27:45.908 "superblock": true, 00:27:45.908 "num_base_bdevs": 2, 00:27:45.908 "num_base_bdevs_discovered": 1, 00:27:45.908 "num_base_bdevs_operational": 1, 00:27:45.908 
"base_bdevs_list": [ 00:27:45.908 { 00:27:45.908 "name": null, 00:27:45.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:45.908 "is_configured": false, 00:27:45.908 "data_offset": 256, 00:27:45.908 "data_size": 7936 00:27:45.908 }, 00:27:45.908 { 00:27:45.908 "name": "pt2", 00:27:45.908 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:45.908 "is_configured": true, 00:27:45.908 "data_offset": 256, 00:27:45.908 "data_size": 7936 00:27:45.908 } 00:27:45.908 ] 00:27:45.908 }' 00:27:45.908 18:31:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.908 18:31:29 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:46.472 18:31:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:46.472 [2024-07-12 18:31:30.194514] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:46.472 [2024-07-12 18:31:30.194544] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:46.472 [2024-07-12 18:31:30.194592] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:46.472 [2024-07-12 18:31:30.194633] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:46.472 [2024-07-12 18:31:30.194644] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23725d0 name raid_bdev1, state offline 00:27:46.730 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.730 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 
-- # raid_bdev= 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:27:46.988 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:47.246 [2024-07-12 18:31:30.800083] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:47.246 [2024-07-12 18:31:30.800124] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:47.246 [2024-07-12 18:31:30.800142] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2370660 00:27:47.246 [2024-07-12 18:31:30.800154] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:47.246 [2024-07-12 18:31:30.801585] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:47.246 
[2024-07-12 18:31:30.801613] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:47.246 [2024-07-12 18:31:30.801658] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:47.246 [2024-07-12 18:31:30.801681] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:47.246 [2024-07-12 18:31:30.801756] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2372d10 00:27:47.246 [2024-07-12 18:31:30.801766] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:47.246 [2024-07-12 18:31:30.801821] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2373560 00:27:47.246 [2024-07-12 18:31:30.801914] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2372d10 00:27:47.246 [2024-07-12 18:31:30.801924] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2372d10 00:27:47.246 [2024-07-12 18:31:30.801999] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:47.246 pt2 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.246 18:31:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.503 18:31:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:47.503 "name": "raid_bdev1", 00:27:47.503 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:47.503 "strip_size_kb": 0, 00:27:47.503 "state": "online", 00:27:47.503 "raid_level": "raid1", 00:27:47.503 "superblock": true, 00:27:47.504 "num_base_bdevs": 2, 00:27:47.504 "num_base_bdevs_discovered": 1, 00:27:47.504 "num_base_bdevs_operational": 1, 00:27:47.504 "base_bdevs_list": [ 00:27:47.504 { 00:27:47.504 "name": null, 00:27:47.504 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:47.504 "is_configured": false, 00:27:47.504 "data_offset": 256, 00:27:47.504 "data_size": 7936 00:27:47.504 }, 00:27:47.504 { 00:27:47.504 "name": "pt2", 00:27:47.504 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:47.504 "is_configured": true, 00:27:47.504 "data_offset": 256, 00:27:47.504 "data_size": 7936 00:27:47.504 } 00:27:47.504 ] 00:27:47.504 }' 00:27:47.504 18:31:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:47.504 18:31:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:48.069 18:31:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:48.326 [2024-07-12 18:31:31.894991] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:48.326 [2024-07-12 18:31:31.895016] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:48.326 [2024-07-12 18:31:31.895064] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:48.326 [2024-07-12 18:31:31.895106] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:48.326 [2024-07-12 18:31:31.895117] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2372d10 name raid_bdev1, state offline 00:27:48.326 18:31:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.326 18:31:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:48.584 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:48.584 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:48.584 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:48.584 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:48.843 [2024-07-12 18:31:32.388271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:48.843 [2024-07-12 18:31:32.388314] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:48.843 [2024-07-12 18:31:32.388331] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2371760 00:27:48.843 [2024-07-12 18:31:32.388344] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:48.843 [2024-07-12 18:31:32.389743] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:48.843 [2024-07-12 18:31:32.389771] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:48.843 [2024-07-12 18:31:32.389817] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:48.843 [2024-07-12 18:31:32.389840] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:48.843 [2024-07-12 18:31:32.389936] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:48.843 [2024-07-12 18:31:32.389955] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:48.843 [2024-07-12 18:31:32.389968] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2373850 name raid_bdev1, state configuring 00:27:48.843 [2024-07-12 18:31:32.389991] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:48.843 [2024-07-12 18:31:32.390041] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2372850 00:27:48.843 [2024-07-12 18:31:32.390051] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:48.843 [2024-07-12 18:31:32.390104] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23733b0 00:27:48.843 [2024-07-12 18:31:32.390200] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2372850 00:27:48.843 [2024-07-12 18:31:32.390209] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2372850 00:27:48.843 [2024-07-12 18:31:32.390281] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:27:48.843 pt1 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.843 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.101 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:49.101 "name": "raid_bdev1", 00:27:49.101 "uuid": "47aff921-367a-44ac-b888-7f66c8aa78d2", 00:27:49.101 "strip_size_kb": 0, 00:27:49.101 "state": "online", 00:27:49.101 "raid_level": 
"raid1", 00:27:49.101 "superblock": true, 00:27:49.101 "num_base_bdevs": 2, 00:27:49.101 "num_base_bdevs_discovered": 1, 00:27:49.101 "num_base_bdevs_operational": 1, 00:27:49.101 "base_bdevs_list": [ 00:27:49.101 { 00:27:49.101 "name": null, 00:27:49.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:49.101 "is_configured": false, 00:27:49.101 "data_offset": 256, 00:27:49.101 "data_size": 7936 00:27:49.101 }, 00:27:49.101 { 00:27:49.101 "name": "pt2", 00:27:49.101 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:49.101 "is_configured": true, 00:27:49.101 "data_offset": 256, 00:27:49.101 "data_size": 7936 00:27:49.101 } 00:27:49.101 ] 00:27:49.101 }' 00:27:49.101 18:31:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:49.101 18:31:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:49.685 18:31:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:49.685 18:31:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:49.945 18:31:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:49.945 18:31:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:49.945 18:31:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:50.204 [2024-07-12 18:31:33.720048] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 47aff921-367a-44ac-b888-7f66c8aa78d2 '!=' 47aff921-367a-44ac-b888-7f66c8aa78d2 ']' 
00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 2607806 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2607806 ']' 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 2607806 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2607806 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2607806' 00:27:50.204 killing process with pid 2607806 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 2607806 00:27:50.204 [2024-07-12 18:31:33.788163] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:50.204 [2024-07-12 18:31:33.788211] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:50.204 [2024-07-12 18:31:33.788252] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:50.204 [2024-07-12 18:31:33.788264] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2372850 name raid_bdev1, state offline 00:27:50.204 18:31:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 2607806 00:27:50.204 [2024-07-12 18:31:33.814248] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:50.464 18:31:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:27:50.464 00:27:50.464 real 0m16.164s 00:27:50.464 user 0m29.404s 00:27:50.464 sys 0m2.911s 00:27:50.464 18:31:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:50.464 18:31:34 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:50.464 ************************************ 00:27:50.464 END TEST raid_superblock_test_md_separate 00:27:50.464 ************************************ 00:27:50.464 18:31:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:50.464 18:31:34 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:27:50.464 18:31:34 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:27:50.464 18:31:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:50.464 18:31:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:50.464 18:31:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:50.464 ************************************ 00:27:50.464 START TEST raid_rebuild_test_sb_md_separate 00:27:50.464 ************************************ 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:50.464 18:31:34 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=2610209 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 2610209 /var/tmp/spdk-raid.sock 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2610209 ']' 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:50.464 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:50.464 18:31:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:50.723 [2024-07-12 18:31:34.197509] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:27:50.723 [2024-07-12 18:31:34.197592] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2610209 ] 00:27:50.723 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:50.723 Zero copy mechanism will not be used. 00:27:50.723 [2024-07-12 18:31:34.326174] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:50.723 [2024-07-12 18:31:34.434212] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:50.982 [2024-07-12 18:31:34.506300] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:50.982 [2024-07-12 18:31:34.506334] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:51.550 18:31:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:51.550 18:31:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:51.550 18:31:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:51.550 18:31:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:27:51.808 BaseBdev1_malloc 00:27:51.808 18:31:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:52.067 [2024-07-12 18:31:35.616349] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:52.067 [2024-07-12 18:31:35.616397] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:52.067 [2024-07-12 
18:31:35.616421] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9e56d0 00:27:52.067 [2024-07-12 18:31:35.616434] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:52.067 [2024-07-12 18:31:35.617960] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:52.067 [2024-07-12 18:31:35.617992] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:52.067 BaseBdev1 00:27:52.067 18:31:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:52.067 18:31:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:27:52.327 BaseBdev2_malloc 00:27:52.327 18:31:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:52.586 [2024-07-12 18:31:36.054968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:52.586 [2024-07-12 18:31:36.055012] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:52.586 [2024-07-12 18:31:36.055032] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb3d1f0 00:27:52.586 [2024-07-12 18:31:36.055045] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:52.586 [2024-07-12 18:31:36.056283] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:52.586 [2024-07-12 18:31:36.056311] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:52.586 BaseBdev2 00:27:52.586 18:31:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:27:52.844 spare_malloc 00:27:52.844 18:31:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:53.103 spare_delay 00:27:53.362 18:31:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:53.362 [2024-07-12 18:31:37.074902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:53.362 [2024-07-12 18:31:37.074954] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:53.362 [2024-07-12 18:31:37.074979] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb397a0 00:27:53.362 [2024-07-12 18:31:37.074992] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:53.362 [2024-07-12 18:31:37.076426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:53.362 [2024-07-12 18:31:37.076457] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:53.362 spare 00:27:53.621 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:53.879 [2024-07-12 18:31:37.576234] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:53.879 [2024-07-12 18:31:37.577576] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:53.879 [2024-07-12 18:31:37.577738] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb3a1c0 00:27:53.879 [2024-07-12 18:31:37.577751] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:53.879 [2024-07-12 18:31:37.577821] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa4b360 00:27:53.879 [2024-07-12 18:31:37.577946] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb3a1c0 00:27:53.879 [2024-07-12 18:31:37.577957] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb3a1c0 00:27:53.879 [2024-07-12 18:31:37.578029] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.138 "name": "raid_bdev1", 00:27:54.138 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:27:54.138 "strip_size_kb": 0, 00:27:54.138 "state": "online", 00:27:54.138 "raid_level": "raid1", 00:27:54.138 "superblock": true, 00:27:54.138 "num_base_bdevs": 2, 00:27:54.138 "num_base_bdevs_discovered": 2, 00:27:54.138 "num_base_bdevs_operational": 2, 00:27:54.138 "base_bdevs_list": [ 00:27:54.138 { 00:27:54.138 "name": "BaseBdev1", 00:27:54.138 "uuid": "cc9ff1cf-237f-5070-abc2-e8a80b47c54a", 00:27:54.138 "is_configured": true, 00:27:54.138 "data_offset": 256, 00:27:54.138 "data_size": 7936 00:27:54.138 }, 00:27:54.138 { 00:27:54.138 "name": "BaseBdev2", 00:27:54.138 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:27:54.138 "is_configured": true, 00:27:54.138 "data_offset": 256, 00:27:54.138 "data_size": 7936 00:27:54.138 } 00:27:54.138 ] 00:27:54.138 }' 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.138 18:31:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:55.075 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:55.075 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:55.075 [2024-07-12 18:31:38.675371] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:55.075 18:31:38 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:55.075 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.075 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:55.336 18:31:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:55.336 18:31:38 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:55.595 [2024-07-12 18:31:39.168477] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa4b360 00:27:55.595 /dev/nbd0 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:55.595 1+0 records in 00:27:55.595 1+0 records out 00:27:55.595 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282376 s, 14.5 MB/s 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:55.595 18:31:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:56.532 7936+0 records in 00:27:56.532 7936+0 records out 00:27:56.532 32505856 bytes (33 MB, 31 MiB) copied, 0.768128 s, 42.3 MB/s 00:27:56.532 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:56.532 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:56.532 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:56.532 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:56.532 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:56.532 18:31:40 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:56.532 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:56.791 [2024-07-12 18:31:40.263648] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:56.791 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:56.791 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:56.791 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:56.791 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:56.791 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:56.791 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:56.791 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:56.791 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:56.791 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:56.791 [2024-07-12 18:31:40.500320] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:57.050 "name": "raid_bdev1", 00:27:57.050 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:27:57.050 "strip_size_kb": 0, 00:27:57.050 "state": "online", 00:27:57.050 "raid_level": "raid1", 00:27:57.050 "superblock": true, 00:27:57.050 "num_base_bdevs": 2, 00:27:57.050 "num_base_bdevs_discovered": 1, 00:27:57.050 "num_base_bdevs_operational": 1, 00:27:57.050 "base_bdevs_list": [ 00:27:57.050 { 00:27:57.050 "name": null, 00:27:57.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:57.050 "is_configured": false, 00:27:57.050 "data_offset": 256, 00:27:57.050 "data_size": 7936 00:27:57.050 }, 00:27:57.050 { 00:27:57.050 "name": "BaseBdev2", 
00:27:57.050 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:27:57.050 "is_configured": true, 00:27:57.050 "data_offset": 256, 00:27:57.050 "data_size": 7936 00:27:57.050 } 00:27:57.050 ] 00:27:57.050 }' 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:57.050 18:31:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:57.984 18:31:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:57.984 [2024-07-12 18:31:41.603257] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:57.984 [2024-07-12 18:31:41.605533] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9e43e0 00:27:57.984 [2024-07-12 18:31:41.607793] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:57.984 18:31:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:58.915 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:58.915 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:58.915 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:58.915 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:58.915 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:58.915 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.916 18:31:42 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.174 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:59.174 "name": "raid_bdev1", 00:27:59.174 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:27:59.174 "strip_size_kb": 0, 00:27:59.174 "state": "online", 00:27:59.174 "raid_level": "raid1", 00:27:59.174 "superblock": true, 00:27:59.174 "num_base_bdevs": 2, 00:27:59.174 "num_base_bdevs_discovered": 2, 00:27:59.174 "num_base_bdevs_operational": 2, 00:27:59.174 "process": { 00:27:59.174 "type": "rebuild", 00:27:59.174 "target": "spare", 00:27:59.174 "progress": { 00:27:59.174 "blocks": 3072, 00:27:59.174 "percent": 38 00:27:59.174 } 00:27:59.174 }, 00:27:59.174 "base_bdevs_list": [ 00:27:59.174 { 00:27:59.174 "name": "spare", 00:27:59.174 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 00:27:59.174 "is_configured": true, 00:27:59.174 "data_offset": 256, 00:27:59.174 "data_size": 7936 00:27:59.174 }, 00:27:59.174 { 00:27:59.174 "name": "BaseBdev2", 00:27:59.174 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:27:59.174 "is_configured": true, 00:27:59.174 "data_offset": 256, 00:27:59.174 "data_size": 7936 00:27:59.174 } 00:27:59.174 ] 00:27:59.174 }' 00:27:59.174 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:59.431 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:59.431 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:59.431 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:59.431 18:31:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:27:59.687 [2024-07-12 18:31:43.192942] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:59.687 [2024-07-12 18:31:43.220138] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:59.687 [2024-07-12 18:31:43.220184] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:59.687 [2024-07-12 18:31:43.220205] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:59.687 [2024-07-12 18:31:43.220214] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:59.687 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.688 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.944 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:59.944 "name": "raid_bdev1", 00:27:59.944 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:27:59.944 "strip_size_kb": 0, 00:27:59.944 "state": "online", 00:27:59.944 "raid_level": "raid1", 00:27:59.944 "superblock": true, 00:27:59.944 "num_base_bdevs": 2, 00:27:59.944 "num_base_bdevs_discovered": 1, 00:27:59.944 "num_base_bdevs_operational": 1, 00:27:59.944 "base_bdevs_list": [ 00:27:59.944 { 00:27:59.944 "name": null, 00:27:59.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:59.944 "is_configured": false, 00:27:59.944 "data_offset": 256, 00:27:59.944 "data_size": 7936 00:27:59.944 }, 00:27:59.944 { 00:27:59.944 "name": "BaseBdev2", 00:27:59.944 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:27:59.944 "is_configured": true, 00:27:59.944 "data_offset": 256, 00:27:59.944 "data_size": 7936 00:27:59.944 } 00:27:59.944 ] 00:27:59.944 }' 00:27:59.944 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:59.945 18:31:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:00.509 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:00.509 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.509 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:00.509 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:00.509 18:31:44 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.509 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.509 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.767 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:00.767 "name": "raid_bdev1", 00:28:00.767 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:00.767 "strip_size_kb": 0, 00:28:00.767 "state": "online", 00:28:00.767 "raid_level": "raid1", 00:28:00.767 "superblock": true, 00:28:00.767 "num_base_bdevs": 2, 00:28:00.767 "num_base_bdevs_discovered": 1, 00:28:00.767 "num_base_bdevs_operational": 1, 00:28:00.767 "base_bdevs_list": [ 00:28:00.767 { 00:28:00.767 "name": null, 00:28:00.767 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:00.767 "is_configured": false, 00:28:00.767 "data_offset": 256, 00:28:00.767 "data_size": 7936 00:28:00.767 }, 00:28:00.767 { 00:28:00.767 "name": "BaseBdev2", 00:28:00.767 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:00.767 "is_configured": true, 00:28:00.767 "data_offset": 256, 00:28:00.767 "data_size": 7936 00:28:00.767 } 00:28:00.767 ] 00:28:00.767 }' 00:28:00.767 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:00.767 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:00.767 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.767 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:00.767 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:01.025 [2024-07-12 18:31:44.662999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:01.025 [2024-07-12 18:31:44.665295] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa4ca70 00:28:01.025 [2024-07-12 18:31:44.666825] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:01.025 18:31:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:02.394 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:02.394 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:02.394 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:02.394 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:02.394 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:02.394 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.394 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.394 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:02.394 "name": "raid_bdev1", 00:28:02.394 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:02.394 "strip_size_kb": 0, 00:28:02.394 "state": "online", 00:28:02.394 "raid_level": "raid1", 00:28:02.394 "superblock": true, 00:28:02.394 "num_base_bdevs": 2, 
00:28:02.394 "num_base_bdevs_discovered": 2, 00:28:02.394 "num_base_bdevs_operational": 2, 00:28:02.394 "process": { 00:28:02.394 "type": "rebuild", 00:28:02.395 "target": "spare", 00:28:02.395 "progress": { 00:28:02.395 "blocks": 3072, 00:28:02.395 "percent": 38 00:28:02.395 } 00:28:02.395 }, 00:28:02.395 "base_bdevs_list": [ 00:28:02.395 { 00:28:02.395 "name": "spare", 00:28:02.395 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 00:28:02.395 "is_configured": true, 00:28:02.395 "data_offset": 256, 00:28:02.395 "data_size": 7936 00:28:02.395 }, 00:28:02.395 { 00:28:02.395 "name": "BaseBdev2", 00:28:02.395 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:02.395 "is_configured": true, 00:28:02.395 "data_offset": 256, 00:28:02.395 "data_size": 7936 00:28:02.395 } 00:28:02.395 ] 00:28:02.395 }' 00:28:02.395 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:02.395 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:02.395 18:31:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:02.395 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1079 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.395 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.656 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:02.656 "name": "raid_bdev1", 00:28:02.656 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:02.656 "strip_size_kb": 0, 00:28:02.656 "state": "online", 00:28:02.656 "raid_level": "raid1", 00:28:02.656 "superblock": true, 00:28:02.656 "num_base_bdevs": 2, 00:28:02.656 "num_base_bdevs_discovered": 2, 00:28:02.656 "num_base_bdevs_operational": 2, 00:28:02.656 "process": { 00:28:02.656 "type": "rebuild", 00:28:02.656 "target": "spare", 00:28:02.656 "progress": { 00:28:02.656 "blocks": 3840, 00:28:02.656 "percent": 48 00:28:02.656 } 00:28:02.656 }, 00:28:02.656 "base_bdevs_list": [ 00:28:02.656 { 00:28:02.656 "name": "spare", 00:28:02.657 "uuid": 
"24e84382-8187-522d-a748-54c7d192b3e4", 00:28:02.657 "is_configured": true, 00:28:02.657 "data_offset": 256, 00:28:02.657 "data_size": 7936 00:28:02.657 }, 00:28:02.657 { 00:28:02.657 "name": "BaseBdev2", 00:28:02.657 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:02.657 "is_configured": true, 00:28:02.657 "data_offset": 256, 00:28:02.657 "data_size": 7936 00:28:02.657 } 00:28:02.657 ] 00:28:02.657 }' 00:28:02.657 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:02.657 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:02.657 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:02.657 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:02.657 18:31:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.062 18:31:47 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:04.062 "name": "raid_bdev1", 00:28:04.062 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:04.062 "strip_size_kb": 0, 00:28:04.062 "state": "online", 00:28:04.062 "raid_level": "raid1", 00:28:04.062 "superblock": true, 00:28:04.062 "num_base_bdevs": 2, 00:28:04.062 "num_base_bdevs_discovered": 2, 00:28:04.062 "num_base_bdevs_operational": 2, 00:28:04.062 "process": { 00:28:04.062 "type": "rebuild", 00:28:04.062 "target": "spare", 00:28:04.062 "progress": { 00:28:04.062 "blocks": 7424, 00:28:04.062 "percent": 93 00:28:04.062 } 00:28:04.062 }, 00:28:04.062 "base_bdevs_list": [ 00:28:04.062 { 00:28:04.062 "name": "spare", 00:28:04.062 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 00:28:04.062 "is_configured": true, 00:28:04.062 "data_offset": 256, 00:28:04.062 "data_size": 7936 00:28:04.062 }, 00:28:04.062 { 00:28:04.062 "name": "BaseBdev2", 00:28:04.062 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:04.062 "is_configured": true, 00:28:04.062 "data_offset": 256, 00:28:04.062 "data_size": 7936 00:28:04.062 } 00:28:04.062 ] 00:28:04.062 }' 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:04.062 18:31:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:04.320 [2024-07-12 18:31:47.791027] bdev_raid.c:2789:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:28:04.320 [2024-07-12 18:31:47.791083] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:04.320 [2024-07-12 18:31:47.791163] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:05.254 "name": "raid_bdev1", 00:28:05.254 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:05.254 "strip_size_kb": 0, 00:28:05.254 "state": "online", 00:28:05.254 "raid_level": "raid1", 00:28:05.254 "superblock": true, 00:28:05.254 "num_base_bdevs": 2, 00:28:05.254 "num_base_bdevs_discovered": 2, 00:28:05.254 "num_base_bdevs_operational": 2, 00:28:05.254 "base_bdevs_list": [ 00:28:05.254 { 00:28:05.254 "name": "spare", 00:28:05.254 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 
00:28:05.254 "is_configured": true, 00:28:05.254 "data_offset": 256, 00:28:05.254 "data_size": 7936 00:28:05.254 }, 00:28:05.254 { 00:28:05.254 "name": "BaseBdev2", 00:28:05.254 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:05.254 "is_configured": true, 00:28:05.254 "data_offset": 256, 00:28:05.254 "data_size": 7936 00:28:05.254 } 00:28:05.254 ] 00:28:05.254 }' 00:28:05.254 18:31:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.513 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:05.772 18:31:49 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:05.772 "name": "raid_bdev1", 00:28:05.772 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:05.772 "strip_size_kb": 0, 00:28:05.772 "state": "online", 00:28:05.772 "raid_level": "raid1", 00:28:05.772 "superblock": true, 00:28:05.772 "num_base_bdevs": 2, 00:28:05.772 "num_base_bdevs_discovered": 2, 00:28:05.772 "num_base_bdevs_operational": 2, 00:28:05.772 "base_bdevs_list": [ 00:28:05.772 { 00:28:05.772 "name": "spare", 00:28:05.772 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 00:28:05.772 "is_configured": true, 00:28:05.772 "data_offset": 256, 00:28:05.772 "data_size": 7936 00:28:05.772 }, 00:28:05.772 { 00:28:05.772 "name": "BaseBdev2", 00:28:05.772 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:05.772 "is_configured": true, 00:28:05.772 "data_offset": 256, 00:28:05.772 "data_size": 7936 00:28:05.772 } 00:28:05.772 ] 00:28:05.772 }' 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:05.772 18:31:49 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:05.772 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.030 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:06.030 "name": "raid_bdev1", 00:28:06.030 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:06.030 "strip_size_kb": 0, 00:28:06.030 "state": "online", 00:28:06.030 "raid_level": "raid1", 00:28:06.030 "superblock": true, 00:28:06.030 "num_base_bdevs": 2, 00:28:06.030 "num_base_bdevs_discovered": 2, 00:28:06.030 "num_base_bdevs_operational": 2, 00:28:06.030 "base_bdevs_list": [ 00:28:06.030 { 00:28:06.030 "name": "spare", 00:28:06.030 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 00:28:06.030 "is_configured": true, 00:28:06.030 "data_offset": 256, 00:28:06.030 "data_size": 7936 00:28:06.030 }, 00:28:06.030 { 00:28:06.030 "name": "BaseBdev2", 00:28:06.030 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:06.030 "is_configured": true, 00:28:06.030 "data_offset": 256, 00:28:06.030 "data_size": 7936 00:28:06.030 } 00:28:06.030 ] 
00:28:06.030 }' 00:28:06.030 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:06.030 18:31:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:06.596 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:06.854 [2024-07-12 18:31:50.445810] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:06.854 [2024-07-12 18:31:50.445837] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:06.854 [2024-07-12 18:31:50.445895] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:06.854 [2024-07-12 18:31:50.445959] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:06.854 [2024-07-12 18:31:50.445972] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb3a1c0 name raid_bdev1, state offline 00:28:06.854 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.854 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:28:07.112 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:07.112 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:07.112 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:28:07.112 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:07.112 18:31:50 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:07.112 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:07.112 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:07.112 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:07.112 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:07.113 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:07.113 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:07.113 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:07.113 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:07.371 /dev/nbd0 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:07.371 18:31:50 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:07.371 1+0 records in 00:28:07.371 1+0 records out 00:28:07.371 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263289 s, 15.6 MB/s 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:07.371 18:31:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:07.371 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:07.371 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:07.371 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:07.371 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:07.371 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:07.630 /dev/nbd1 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:07.630 18:31:51 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:07.630 1+0 records in 00:28:07.630 1+0 records out 00:28:07.630 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000330953 s, 12.4 MB/s 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:07.630 18:31:51 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:07.630 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:07.894 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:07.894 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:07.894 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:07.894 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:07.894 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:07.894 18:31:51 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:07.894 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:07.894 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:07.894 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:07.894 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:08.151 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:08.409 18:31:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:08.409 [2024-07-12 18:31:52.111228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:08.409 [2024-07-12 18:31:52.111275] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:08.409 [2024-07-12 18:31:52.111295] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb399d0 00:28:08.409 [2024-07-12 18:31:52.111308] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:08.409 [2024-07-12 18:31:52.112764] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:08.409 [2024-07-12 18:31:52.112795] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:08.409 [2024-07-12 18:31:52.112855] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:08.409 [2024-07-12 18:31:52.112881] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:08.409 [2024-07-12 18:31:52.112995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:08.409 spare 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:08.409 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.668 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:08.668 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.668 [2024-07-12 18:31:52.213303] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa4b7c0 00:28:08.668 [2024-07-12 18:31:52.213321] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:08.668 [2024-07-12 18:31:52.213394] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa4b620 00:28:08.668 [2024-07-12 18:31:52.213516] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa4b7c0 00:28:08.668 [2024-07-12 18:31:52.213526] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa4b7c0 00:28:08.668 [2024-07-12 18:31:52.213601] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:08.668 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:08.668 "name": "raid_bdev1", 00:28:08.668 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:08.668 "strip_size_kb": 0, 00:28:08.668 "state": "online", 00:28:08.668 "raid_level": "raid1", 00:28:08.668 "superblock": true, 00:28:08.668 "num_base_bdevs": 2, 00:28:08.668 
"num_base_bdevs_discovered": 2, 00:28:08.668 "num_base_bdevs_operational": 2, 00:28:08.668 "base_bdevs_list": [ 00:28:08.668 { 00:28:08.668 "name": "spare", 00:28:08.668 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 00:28:08.668 "is_configured": true, 00:28:08.668 "data_offset": 256, 00:28:08.668 "data_size": 7936 00:28:08.668 }, 00:28:08.668 { 00:28:08.668 "name": "BaseBdev2", 00:28:08.668 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:08.668 "is_configured": true, 00:28:08.668 "data_offset": 256, 00:28:08.668 "data_size": 7936 00:28:08.668 } 00:28:08.668 ] 00:28:08.668 }' 00:28:08.668 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:08.668 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:09.234 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:09.234 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:09.234 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:09.234 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:09.234 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:09.234 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:09.234 18:31:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:09.491 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:09.491 "name": "raid_bdev1", 00:28:09.491 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:09.491 
"strip_size_kb": 0, 00:28:09.491 "state": "online", 00:28:09.491 "raid_level": "raid1", 00:28:09.491 "superblock": true, 00:28:09.491 "num_base_bdevs": 2, 00:28:09.491 "num_base_bdevs_discovered": 2, 00:28:09.491 "num_base_bdevs_operational": 2, 00:28:09.491 "base_bdevs_list": [ 00:28:09.491 { 00:28:09.491 "name": "spare", 00:28:09.491 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 00:28:09.491 "is_configured": true, 00:28:09.491 "data_offset": 256, 00:28:09.491 "data_size": 7936 00:28:09.491 }, 00:28:09.491 { 00:28:09.491 "name": "BaseBdev2", 00:28:09.491 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:09.491 "is_configured": true, 00:28:09.491 "data_offset": 256, 00:28:09.491 "data_size": 7936 00:28:09.491 } 00:28:09.491 ] 00:28:09.491 }' 00:28:09.491 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:09.749 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:09.749 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:09.749 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:09.749 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:09.749 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.007 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:10.007 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:10.265 [2024-07-12 18:31:53.759718] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.265 18:31:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:10.523 18:31:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:10.523 "name": "raid_bdev1", 00:28:10.523 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:10.523 "strip_size_kb": 0, 00:28:10.523 "state": "online", 00:28:10.523 "raid_level": "raid1", 00:28:10.523 "superblock": true, 00:28:10.523 
"num_base_bdevs": 2, 00:28:10.523 "num_base_bdevs_discovered": 1, 00:28:10.523 "num_base_bdevs_operational": 1, 00:28:10.523 "base_bdevs_list": [ 00:28:10.523 { 00:28:10.523 "name": null, 00:28:10.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:10.524 "is_configured": false, 00:28:10.524 "data_offset": 256, 00:28:10.524 "data_size": 7936 00:28:10.524 }, 00:28:10.524 { 00:28:10.524 "name": "BaseBdev2", 00:28:10.524 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:10.524 "is_configured": true, 00:28:10.524 "data_offset": 256, 00:28:10.524 "data_size": 7936 00:28:10.524 } 00:28:10.524 ] 00:28:10.524 }' 00:28:10.524 18:31:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:10.524 18:31:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:11.088 18:31:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:11.346 [2024-07-12 18:31:54.882724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:11.346 [2024-07-12 18:31:54.882878] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:11.346 [2024-07-12 18:31:54.882895] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:11.346 [2024-07-12 18:31:54.882922] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:11.346 [2024-07-12 18:31:54.885107] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa4a870 00:28:11.346 [2024-07-12 18:31:54.886436] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:11.346 18:31:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:12.277 18:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:12.277 18:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:12.277 18:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:12.277 18:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:12.277 18:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:12.277 18:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.277 18:31:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:12.534 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:12.534 "name": "raid_bdev1", 00:28:12.534 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:12.534 "strip_size_kb": 0, 00:28:12.534 "state": "online", 00:28:12.534 "raid_level": "raid1", 00:28:12.534 "superblock": true, 00:28:12.534 "num_base_bdevs": 2, 00:28:12.535 "num_base_bdevs_discovered": 2, 00:28:12.535 "num_base_bdevs_operational": 2, 00:28:12.535 "process": { 00:28:12.535 "type": "rebuild", 00:28:12.535 
"target": "spare", 00:28:12.535 "progress": { 00:28:12.535 "blocks": 3072, 00:28:12.535 "percent": 38 00:28:12.535 } 00:28:12.535 }, 00:28:12.535 "base_bdevs_list": [ 00:28:12.535 { 00:28:12.535 "name": "spare", 00:28:12.535 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 00:28:12.535 "is_configured": true, 00:28:12.535 "data_offset": 256, 00:28:12.535 "data_size": 7936 00:28:12.535 }, 00:28:12.535 { 00:28:12.535 "name": "BaseBdev2", 00:28:12.535 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:12.535 "is_configured": true, 00:28:12.535 "data_offset": 256, 00:28:12.535 "data_size": 7936 00:28:12.535 } 00:28:12.535 ] 00:28:12.535 }' 00:28:12.535 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:12.535 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:12.535 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:12.535 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:12.535 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:12.792 [2024-07-12 18:31:56.475903] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:12.792 [2024-07-12 18:31:56.499238] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:12.792 [2024-07-12 18:31:56.499286] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:12.792 [2024-07-12 18:31:56.499302] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:12.792 [2024-07-12 18:31:56.499310] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:13.050 "name": "raid_bdev1", 00:28:13.050 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:13.050 "strip_size_kb": 0, 00:28:13.050 "state": "online", 00:28:13.050 "raid_level": "raid1", 00:28:13.050 "superblock": true, 00:28:13.050 "num_base_bdevs": 2, 00:28:13.050 "num_base_bdevs_discovered": 1, 
00:28:13.050 "num_base_bdevs_operational": 1, 00:28:13.050 "base_bdevs_list": [ 00:28:13.050 { 00:28:13.050 "name": null, 00:28:13.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:13.050 "is_configured": false, 00:28:13.050 "data_offset": 256, 00:28:13.050 "data_size": 7936 00:28:13.050 }, 00:28:13.050 { 00:28:13.050 "name": "BaseBdev2", 00:28:13.050 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:13.050 "is_configured": true, 00:28:13.050 "data_offset": 256, 00:28:13.050 "data_size": 7936 00:28:13.050 } 00:28:13.050 ] 00:28:13.050 }' 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:13.050 18:31:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:13.982 18:31:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:13.983 [2024-07-12 18:31:57.573235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:13.983 [2024-07-12 18:31:57.573285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:13.983 [2024-07-12 18:31:57.573307] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb7e8e0 00:28:13.983 [2024-07-12 18:31:57.573320] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:13.983 [2024-07-12 18:31:57.573526] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:13.983 [2024-07-12 18:31:57.573544] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:13.983 [2024-07-12 18:31:57.573603] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:13.983 [2024-07-12 18:31:57.573615] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller 
than existing raid bdev raid_bdev1 (5) 00:28:13.983 [2024-07-12 18:31:57.573626] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:13.983 [2024-07-12 18:31:57.573645] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:13.983 [2024-07-12 18:31:57.575850] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa4a870 00:28:13.983 [2024-07-12 18:31:57.577190] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:13.983 spare 00:28:13.983 18:31:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:14.915 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:14.916 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:14.916 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:14.916 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:14.916 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:14.916 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:14.916 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.173 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:15.173 "name": "raid_bdev1", 00:28:15.173 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:15.173 "strip_size_kb": 0, 00:28:15.173 "state": "online", 00:28:15.173 "raid_level": "raid1", 00:28:15.173 "superblock": true, 
00:28:15.173 "num_base_bdevs": 2, 00:28:15.173 "num_base_bdevs_discovered": 2, 00:28:15.173 "num_base_bdevs_operational": 2, 00:28:15.173 "process": { 00:28:15.173 "type": "rebuild", 00:28:15.173 "target": "spare", 00:28:15.173 "progress": { 00:28:15.173 "blocks": 3072, 00:28:15.173 "percent": 38 00:28:15.173 } 00:28:15.173 }, 00:28:15.173 "base_bdevs_list": [ 00:28:15.173 { 00:28:15.173 "name": "spare", 00:28:15.173 "uuid": "24e84382-8187-522d-a748-54c7d192b3e4", 00:28:15.173 "is_configured": true, 00:28:15.173 "data_offset": 256, 00:28:15.173 "data_size": 7936 00:28:15.173 }, 00:28:15.173 { 00:28:15.173 "name": "BaseBdev2", 00:28:15.173 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:15.173 "is_configured": true, 00:28:15.173 "data_offset": 256, 00:28:15.173 "data_size": 7936 00:28:15.173 } 00:28:15.174 ] 00:28:15.174 }' 00:28:15.174 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:15.174 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:15.174 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:15.431 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:15.431 18:31:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:15.688 [2024-07-12 18:31:59.170318] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:15.688 [2024-07-12 18:31:59.189527] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:15.688 [2024-07-12 18:31:59.189573] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:15.688 [2024-07-12 18:31:59.189588] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:15.688 [2024-07-12 18:31:59.189596] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.688 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.946 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:15.946 "name": "raid_bdev1", 00:28:15.946 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 
00:28:15.946 "strip_size_kb": 0, 00:28:15.946 "state": "online", 00:28:15.946 "raid_level": "raid1", 00:28:15.946 "superblock": true, 00:28:15.946 "num_base_bdevs": 2, 00:28:15.946 "num_base_bdevs_discovered": 1, 00:28:15.946 "num_base_bdevs_operational": 1, 00:28:15.946 "base_bdevs_list": [ 00:28:15.946 { 00:28:15.946 "name": null, 00:28:15.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.946 "is_configured": false, 00:28:15.946 "data_offset": 256, 00:28:15.946 "data_size": 7936 00:28:15.946 }, 00:28:15.946 { 00:28:15.946 "name": "BaseBdev2", 00:28:15.946 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:15.946 "is_configured": true, 00:28:15.946 "data_offset": 256, 00:28:15.946 "data_size": 7936 00:28:15.946 } 00:28:15.946 ] 00:28:15.946 }' 00:28:15.946 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:15.946 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:16.512 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:16.512 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:16.512 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:16.512 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:16.512 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:16.512 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.512 18:31:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:16.512 18:32:00 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:16.512 "name": "raid_bdev1", 00:28:16.512 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:16.512 "strip_size_kb": 0, 00:28:16.512 "state": "online", 00:28:16.512 "raid_level": "raid1", 00:28:16.512 "superblock": true, 00:28:16.512 "num_base_bdevs": 2, 00:28:16.512 "num_base_bdevs_discovered": 1, 00:28:16.512 "num_base_bdevs_operational": 1, 00:28:16.512 "base_bdevs_list": [ 00:28:16.512 { 00:28:16.512 "name": null, 00:28:16.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:16.512 "is_configured": false, 00:28:16.512 "data_offset": 256, 00:28:16.512 "data_size": 7936 00:28:16.512 }, 00:28:16.512 { 00:28:16.512 "name": "BaseBdev2", 00:28:16.512 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:16.512 "is_configured": true, 00:28:16.512 "data_offset": 256, 00:28:16.512 "data_size": 7936 00:28:16.512 } 00:28:16.512 ] 00:28:16.512 }' 00:28:16.512 18:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:16.770 18:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:16.770 18:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:16.770 18:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:16.770 18:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:17.027 18:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:17.284 [2024-07-12 18:32:00.793086] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:28:17.284 [2024-07-12 18:32:00.793130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:17.284 [2024-07-12 18:32:00.793150] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9e5900 00:28:17.284 [2024-07-12 18:32:00.793163] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:17.284 [2024-07-12 18:32:00.793343] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:17.284 [2024-07-12 18:32:00.793361] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:17.284 [2024-07-12 18:32:00.793407] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:17.284 [2024-07-12 18:32:00.793418] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:17.284 [2024-07-12 18:32:00.793428] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:17.284 BaseBdev1 00:28:17.284 18:32:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:18.228 
18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.228 18:32:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:18.486 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:18.486 "name": "raid_bdev1", 00:28:18.486 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:18.486 "strip_size_kb": 0, 00:28:18.486 "state": "online", 00:28:18.486 "raid_level": "raid1", 00:28:18.486 "superblock": true, 00:28:18.486 "num_base_bdevs": 2, 00:28:18.486 "num_base_bdevs_discovered": 1, 00:28:18.486 "num_base_bdevs_operational": 1, 00:28:18.486 "base_bdevs_list": [ 00:28:18.486 { 00:28:18.486 "name": null, 00:28:18.486 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:18.486 "is_configured": false, 00:28:18.486 "data_offset": 256, 00:28:18.486 "data_size": 7936 00:28:18.486 }, 00:28:18.486 { 00:28:18.486 "name": "BaseBdev2", 00:28:18.486 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:18.486 "is_configured": true, 00:28:18.486 "data_offset": 256, 00:28:18.486 "data_size": 7936 00:28:18.486 } 00:28:18.486 ] 00:28:18.486 }' 00:28:18.486 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:18.486 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:28:19.051 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:19.051 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:19.051 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:19.051 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:19.051 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:19.051 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.051 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.308 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:19.308 "name": "raid_bdev1", 00:28:19.308 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:19.308 "strip_size_kb": 0, 00:28:19.308 "state": "online", 00:28:19.308 "raid_level": "raid1", 00:28:19.308 "superblock": true, 00:28:19.308 "num_base_bdevs": 2, 00:28:19.308 "num_base_bdevs_discovered": 1, 00:28:19.308 "num_base_bdevs_operational": 1, 00:28:19.308 "base_bdevs_list": [ 00:28:19.308 { 00:28:19.308 "name": null, 00:28:19.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:19.308 "is_configured": false, 00:28:19.308 "data_offset": 256, 00:28:19.308 "data_size": 7936 00:28:19.308 }, 00:28:19.308 { 00:28:19.308 "name": "BaseBdev2", 00:28:19.308 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:19.308 "is_configured": true, 00:28:19.308 "data_offset": 256, 00:28:19.308 "data_size": 7936 00:28:19.308 } 00:28:19.308 ] 00:28:19.308 }' 00:28:19.308 18:32:02 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:19.308 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:19.308 18:32:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:19.308 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:19.565 [2024-07-12 18:32:03.243603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:19.565 [2024-07-12 18:32:03.243727] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:19.565 [2024-07-12 18:32:03.243743] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:19.565 request: 00:28:19.565 { 00:28:19.565 "base_bdev": "BaseBdev1", 00:28:19.565 "raid_bdev": "raid_bdev1", 00:28:19.565 "method": "bdev_raid_add_base_bdev", 00:28:19.565 "req_id": 1 00:28:19.565 } 00:28:19.565 Got JSON-RPC error response 00:28:19.565 response: 00:28:19.565 { 00:28:19.565 "code": -22, 00:28:19.565 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:19.565 } 00:28:19.565 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:19.565 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:19.565 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:19.565 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:19.565 18:32:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.999 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.000 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:21.000 "name": "raid_bdev1", 00:28:21.000 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:21.000 "strip_size_kb": 0, 00:28:21.000 "state": "online", 00:28:21.000 "raid_level": "raid1", 00:28:21.000 "superblock": true, 00:28:21.000 "num_base_bdevs": 2, 00:28:21.000 "num_base_bdevs_discovered": 1, 
00:28:21.000 "num_base_bdevs_operational": 1, 00:28:21.000 "base_bdevs_list": [ 00:28:21.000 { 00:28:21.000 "name": null, 00:28:21.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:21.000 "is_configured": false, 00:28:21.000 "data_offset": 256, 00:28:21.000 "data_size": 7936 00:28:21.000 }, 00:28:21.000 { 00:28:21.000 "name": "BaseBdev2", 00:28:21.000 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:21.000 "is_configured": true, 00:28:21.000 "data_offset": 256, 00:28:21.000 "data_size": 7936 00:28:21.000 } 00:28:21.000 ] 00:28:21.000 }' 00:28:21.000 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:21.000 18:32:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:21.566 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:21.566 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:21.566 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:21.566 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:21.566 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:21.566 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.566 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:21.824 "name": "raid_bdev1", 00:28:21.824 "uuid": "045d94af-4c44-4093-a769-a8f67e16ab80", 00:28:21.824 "strip_size_kb": 0, 00:28:21.824 
"state": "online", 00:28:21.824 "raid_level": "raid1", 00:28:21.824 "superblock": true, 00:28:21.824 "num_base_bdevs": 2, 00:28:21.824 "num_base_bdevs_discovered": 1, 00:28:21.824 "num_base_bdevs_operational": 1, 00:28:21.824 "base_bdevs_list": [ 00:28:21.824 { 00:28:21.824 "name": null, 00:28:21.824 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:21.824 "is_configured": false, 00:28:21.824 "data_offset": 256, 00:28:21.824 "data_size": 7936 00:28:21.824 }, 00:28:21.824 { 00:28:21.824 "name": "BaseBdev2", 00:28:21.824 "uuid": "ff367caa-e524-579d-a72a-736653bf96dc", 00:28:21.824 "is_configured": true, 00:28:21.824 "data_offset": 256, 00:28:21.824 "data_size": 7936 00:28:21.824 } 00:28:21.824 ] 00:28:21.824 }' 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 2610209 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2610209 ']' 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2610209 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2610209 00:28:21.824 18:32:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2610209' 00:28:21.824 killing process with pid 2610209 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2610209 00:28:21.824 Received shutdown signal, test time was about 60.000000 seconds 00:28:21.824 00:28:21.824 Latency(us) 00:28:21.824 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:21.824 =================================================================================================================== 00:28:21.824 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:21.824 [2024-07-12 18:32:05.479268] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:21.824 [2024-07-12 18:32:05.479350] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:21.824 [2024-07-12 18:32:05.479391] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:21.824 [2024-07-12 18:32:05.479403] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa4b7c0 name raid_bdev1, state offline 00:28:21.824 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2610209 00:28:21.824 [2024-07-12 18:32:05.513792] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:22.082 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:28:22.082 00:28:22.082 real 0m31.589s 00:28:22.082 user 0m49.344s 00:28:22.082 sys 0m5.113s 00:28:22.082 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:28:22.082 18:32:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:22.082 ************************************ 00:28:22.082 END TEST raid_rebuild_test_sb_md_separate 00:28:22.082 ************************************ 00:28:22.082 18:32:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:22.082 18:32:05 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:28:22.082 18:32:05 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:28:22.082 18:32:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:22.082 18:32:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:22.082 18:32:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:22.082 ************************************ 00:28:22.082 START TEST raid_state_function_test_sb_md_interleaved 00:28:22.082 ************************************ 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:22.082 18:32:05 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=2614742 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2614742' 00:28:22.082 Process raid pid: 2614742 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 2614742 /var/tmp/spdk-raid.sock 00:28:22.082 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2614742 ']' 00:28:22.340 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:22.340 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:22.340 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:22.340 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:22.340 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:22.340 18:32:05 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:22.340 [2024-07-12 18:32:05.853844] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:28:22.340 [2024-07-12 18:32:05.853907] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:22.340 [2024-07-12 18:32:05.983080] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:22.598 [2024-07-12 18:32:06.085587] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:22.598 [2024-07-12 18:32:06.143307] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:22.598 [2024-07-12 18:32:06.143343] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:23.163 18:32:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:23.163 18:32:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:23.163 18:32:06 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:23.421 [2024-07-12 18:32:07.016095] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:23.421 [2024-07-12 18:32:07.016140] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:23.421 [2024-07-12 18:32:07.016156] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:23.421 [2024-07-12 18:32:07.016169] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.421 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:23.679 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.679 "name": "Existed_Raid", 00:28:23.679 "uuid": "21e4be6a-e4c3-4865-b6c4-d01ed3756a41", 00:28:23.679 "strip_size_kb": 0, 00:28:23.679 "state": "configuring", 00:28:23.679 "raid_level": "raid1", 00:28:23.679 "superblock": true, 00:28:23.679 "num_base_bdevs": 2, 00:28:23.679 "num_base_bdevs_discovered": 0, 00:28:23.679 "num_base_bdevs_operational": 2, 00:28:23.679 "base_bdevs_list": [ 00:28:23.679 { 
00:28:23.679 "name": "BaseBdev1", 00:28:23.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.679 "is_configured": false, 00:28:23.679 "data_offset": 0, 00:28:23.679 "data_size": 0 00:28:23.679 }, 00:28:23.679 { 00:28:23.679 "name": "BaseBdev2", 00:28:23.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.679 "is_configured": false, 00:28:23.679 "data_offset": 0, 00:28:23.679 "data_size": 0 00:28:23.679 } 00:28:23.679 ] 00:28:23.679 }' 00:28:23.679 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.679 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:24.242 18:32:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:24.498 [2024-07-12 18:32:08.038684] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:24.498 [2024-07-12 18:32:08.038714] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x200fa80 name Existed_Raid, state configuring 00:28:24.498 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:24.498 [2024-07-12 18:32:08.215166] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:24.498 [2024-07-12 18:32:08.215197] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:24.498 [2024-07-12 18:32:08.215207] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:24.498 [2024-07-12 18:32:08.215219] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:24.755 
18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:28:24.755 [2024-07-12 18:32:08.469771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:24.755 BaseBdev1 00:28:25.011 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:25.011 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:25.011 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:25.011 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:25.011 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:25.011 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:25.011 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:25.011 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:25.268 [ 00:28:25.268 { 00:28:25.268 "name": "BaseBdev1", 00:28:25.268 "aliases": [ 00:28:25.268 "daf2e969-aef3-4f49-be77-1efcf23a5c48" 00:28:25.268 ], 00:28:25.268 "product_name": "Malloc disk", 00:28:25.268 "block_size": 4128, 00:28:25.268 "num_blocks": 8192, 00:28:25.268 "uuid": "daf2e969-aef3-4f49-be77-1efcf23a5c48", 00:28:25.268 "md_size": 32, 00:28:25.268 
"md_interleave": true, 00:28:25.268 "dif_type": 0, 00:28:25.268 "assigned_rate_limits": { 00:28:25.268 "rw_ios_per_sec": 0, 00:28:25.268 "rw_mbytes_per_sec": 0, 00:28:25.268 "r_mbytes_per_sec": 0, 00:28:25.268 "w_mbytes_per_sec": 0 00:28:25.268 }, 00:28:25.268 "claimed": true, 00:28:25.268 "claim_type": "exclusive_write", 00:28:25.268 "zoned": false, 00:28:25.268 "supported_io_types": { 00:28:25.268 "read": true, 00:28:25.268 "write": true, 00:28:25.268 "unmap": true, 00:28:25.268 "flush": true, 00:28:25.268 "reset": true, 00:28:25.268 "nvme_admin": false, 00:28:25.268 "nvme_io": false, 00:28:25.268 "nvme_io_md": false, 00:28:25.268 "write_zeroes": true, 00:28:25.268 "zcopy": true, 00:28:25.268 "get_zone_info": false, 00:28:25.268 "zone_management": false, 00:28:25.268 "zone_append": false, 00:28:25.268 "compare": false, 00:28:25.268 "compare_and_write": false, 00:28:25.268 "abort": true, 00:28:25.268 "seek_hole": false, 00:28:25.268 "seek_data": false, 00:28:25.268 "copy": true, 00:28:25.268 "nvme_iov_md": false 00:28:25.268 }, 00:28:25.268 "memory_domains": [ 00:28:25.269 { 00:28:25.269 "dma_device_id": "system", 00:28:25.269 "dma_device_type": 1 00:28:25.269 }, 00:28:25.269 { 00:28:25.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:25.269 "dma_device_type": 2 00:28:25.269 } 00:28:25.269 ], 00:28:25.269 "driver_specific": {} 00:28:25.269 } 00:28:25.269 ] 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:25.269 18:32:08 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.269 18:32:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:25.526 18:32:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:25.526 "name": "Existed_Raid", 00:28:25.526 "uuid": "f3cc814f-8487-4115-8c0d-fa1ba0529bf7", 00:28:25.526 "strip_size_kb": 0, 00:28:25.526 "state": "configuring", 00:28:25.526 "raid_level": "raid1", 00:28:25.526 "superblock": true, 00:28:25.526 "num_base_bdevs": 2, 00:28:25.526 "num_base_bdevs_discovered": 1, 00:28:25.526 "num_base_bdevs_operational": 2, 00:28:25.526 "base_bdevs_list": [ 00:28:25.526 { 00:28:25.526 "name": "BaseBdev1", 00:28:25.526 "uuid": "daf2e969-aef3-4f49-be77-1efcf23a5c48", 00:28:25.526 "is_configured": true, 00:28:25.526 "data_offset": 256, 00:28:25.526 "data_size": 7936 00:28:25.526 }, 
00:28:25.526 { 00:28:25.526 "name": "BaseBdev2", 00:28:25.526 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:25.526 "is_configured": false, 00:28:25.526 "data_offset": 0, 00:28:25.526 "data_size": 0 00:28:25.526 } 00:28:25.526 ] 00:28:25.526 }' 00:28:25.526 18:32:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:25.526 18:32:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:26.092 18:32:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:26.350 [2024-07-12 18:32:10.038002] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:26.350 [2024-07-12 18:32:10.038046] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x200f350 name Existed_Raid, state configuring 00:28:26.350 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:26.609 [2024-07-12 18:32:10.282683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:26.609 [2024-07-12 18:32:10.284213] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:26.609 [2024-07-12 18:32:10.284248] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.609 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:26.867 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:26.867 "name": "Existed_Raid", 00:28:26.867 "uuid": "8856e553-9fb4-47a8-bd46-747c95b604ef", 00:28:26.867 "strip_size_kb": 0, 00:28:26.867 "state": "configuring", 00:28:26.867 "raid_level": "raid1", 00:28:26.867 "superblock": true, 00:28:26.867 "num_base_bdevs": 2, 
00:28:26.867 "num_base_bdevs_discovered": 1, 00:28:26.867 "num_base_bdevs_operational": 2, 00:28:26.867 "base_bdevs_list": [ 00:28:26.867 { 00:28:26.867 "name": "BaseBdev1", 00:28:26.867 "uuid": "daf2e969-aef3-4f49-be77-1efcf23a5c48", 00:28:26.867 "is_configured": true, 00:28:26.867 "data_offset": 256, 00:28:26.867 "data_size": 7936 00:28:26.867 }, 00:28:26.867 { 00:28:26.867 "name": "BaseBdev2", 00:28:26.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:26.867 "is_configured": false, 00:28:26.867 "data_offset": 0, 00:28:26.868 "data_size": 0 00:28:26.868 } 00:28:26.868 ] 00:28:26.868 }' 00:28:26.868 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:26.868 18:32:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:27.434 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:28:27.691 [2024-07-12 18:32:11.369108] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:27.691 [2024-07-12 18:32:11.369239] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2011180 00:28:27.691 [2024-07-12 18:32:11.369253] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:27.691 [2024-07-12 18:32:11.369312] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2011150 00:28:27.691 [2024-07-12 18:32:11.369384] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2011180 00:28:27.692 [2024-07-12 18:32:11.369394] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2011180 00:28:27.692 [2024-07-12 18:32:11.369448] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:27.692 BaseBdev2 
00:28:27.692 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:27.692 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:27.692 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:27.692 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:28:27.692 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:27.692 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:27.692 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:27.950 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:28.208 [ 00:28:28.208 { 00:28:28.208 "name": "BaseBdev2", 00:28:28.208 "aliases": [ 00:28:28.208 "b003afd1-53f9-4f6d-a66e-a64c79007219" 00:28:28.208 ], 00:28:28.208 "product_name": "Malloc disk", 00:28:28.208 "block_size": 4128, 00:28:28.208 "num_blocks": 8192, 00:28:28.208 "uuid": "b003afd1-53f9-4f6d-a66e-a64c79007219", 00:28:28.208 "md_size": 32, 00:28:28.208 "md_interleave": true, 00:28:28.208 "dif_type": 0, 00:28:28.208 "assigned_rate_limits": { 00:28:28.208 "rw_ios_per_sec": 0, 00:28:28.208 "rw_mbytes_per_sec": 0, 00:28:28.208 "r_mbytes_per_sec": 0, 00:28:28.208 "w_mbytes_per_sec": 0 00:28:28.208 }, 00:28:28.208 "claimed": true, 00:28:28.208 "claim_type": "exclusive_write", 00:28:28.208 "zoned": false, 00:28:28.208 "supported_io_types": { 
00:28:28.208 "read": true, 00:28:28.208 "write": true, 00:28:28.208 "unmap": true, 00:28:28.208 "flush": true, 00:28:28.208 "reset": true, 00:28:28.208 "nvme_admin": false, 00:28:28.208 "nvme_io": false, 00:28:28.208 "nvme_io_md": false, 00:28:28.208 "write_zeroes": true, 00:28:28.208 "zcopy": true, 00:28:28.208 "get_zone_info": false, 00:28:28.208 "zone_management": false, 00:28:28.208 "zone_append": false, 00:28:28.208 "compare": false, 00:28:28.208 "compare_and_write": false, 00:28:28.208 "abort": true, 00:28:28.208 "seek_hole": false, 00:28:28.208 "seek_data": false, 00:28:28.208 "copy": true, 00:28:28.208 "nvme_iov_md": false 00:28:28.208 }, 00:28:28.208 "memory_domains": [ 00:28:28.208 { 00:28:28.208 "dma_device_id": "system", 00:28:28.208 "dma_device_type": 1 00:28:28.208 }, 00:28:28.208 { 00:28:28.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:28.208 "dma_device_type": 2 00:28:28.208 } 00:28:28.208 ], 00:28:28.208 "driver_specific": {} 00:28:28.208 } 00:28:28.208 ] 00:28:28.208 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:28:28.208 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:28.208 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:28.208 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:28.208 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:28.208 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:28.208 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:28.208 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:28.208 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:28.209 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:28.209 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:28.209 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:28.209 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:28.209 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.209 18:32:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:28.467 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:28.467 "name": "Existed_Raid", 00:28:28.467 "uuid": "8856e553-9fb4-47a8-bd46-747c95b604ef", 00:28:28.467 "strip_size_kb": 0, 00:28:28.467 "state": "online", 00:28:28.467 "raid_level": "raid1", 00:28:28.467 "superblock": true, 00:28:28.467 "num_base_bdevs": 2, 00:28:28.467 "num_base_bdevs_discovered": 2, 00:28:28.467 "num_base_bdevs_operational": 2, 00:28:28.467 "base_bdevs_list": [ 00:28:28.467 { 00:28:28.467 "name": "BaseBdev1", 00:28:28.467 "uuid": "daf2e969-aef3-4f49-be77-1efcf23a5c48", 00:28:28.467 "is_configured": true, 00:28:28.467 "data_offset": 256, 00:28:28.467 "data_size": 7936 00:28:28.467 }, 00:28:28.467 { 00:28:28.467 "name": "BaseBdev2", 00:28:28.467 "uuid": "b003afd1-53f9-4f6d-a66e-a64c79007219", 00:28:28.467 "is_configured": true, 00:28:28.467 "data_offset": 256, 00:28:28.467 
"data_size": 7936 00:28:28.467 } 00:28:28.467 ] 00:28:28.467 }' 00:28:28.467 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:28.467 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:29.033 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:29.033 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:29.033 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:29.033 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:29.033 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:29.033 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:29.033 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:29.033 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:29.291 [2024-07-12 18:32:12.881521] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:29.291 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:29.291 "name": "Existed_Raid", 00:28:29.291 "aliases": [ 00:28:29.291 "8856e553-9fb4-47a8-bd46-747c95b604ef" 00:28:29.291 ], 00:28:29.291 "product_name": "Raid Volume", 00:28:29.291 "block_size": 4128, 00:28:29.291 "num_blocks": 7936, 00:28:29.291 "uuid": "8856e553-9fb4-47a8-bd46-747c95b604ef", 00:28:29.291 "md_size": 32, 
00:28:29.291 "md_interleave": true, 00:28:29.291 "dif_type": 0, 00:28:29.291 "assigned_rate_limits": { 00:28:29.291 "rw_ios_per_sec": 0, 00:28:29.291 "rw_mbytes_per_sec": 0, 00:28:29.291 "r_mbytes_per_sec": 0, 00:28:29.291 "w_mbytes_per_sec": 0 00:28:29.291 }, 00:28:29.291 "claimed": false, 00:28:29.291 "zoned": false, 00:28:29.291 "supported_io_types": { 00:28:29.291 "read": true, 00:28:29.291 "write": true, 00:28:29.291 "unmap": false, 00:28:29.291 "flush": false, 00:28:29.291 "reset": true, 00:28:29.291 "nvme_admin": false, 00:28:29.291 "nvme_io": false, 00:28:29.291 "nvme_io_md": false, 00:28:29.291 "write_zeroes": true, 00:28:29.291 "zcopy": false, 00:28:29.291 "get_zone_info": false, 00:28:29.291 "zone_management": false, 00:28:29.291 "zone_append": false, 00:28:29.291 "compare": false, 00:28:29.291 "compare_and_write": false, 00:28:29.291 "abort": false, 00:28:29.291 "seek_hole": false, 00:28:29.291 "seek_data": false, 00:28:29.291 "copy": false, 00:28:29.291 "nvme_iov_md": false 00:28:29.291 }, 00:28:29.291 "memory_domains": [ 00:28:29.291 { 00:28:29.291 "dma_device_id": "system", 00:28:29.291 "dma_device_type": 1 00:28:29.291 }, 00:28:29.291 { 00:28:29.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:29.291 "dma_device_type": 2 00:28:29.291 }, 00:28:29.291 { 00:28:29.291 "dma_device_id": "system", 00:28:29.291 "dma_device_type": 1 00:28:29.291 }, 00:28:29.291 { 00:28:29.291 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:29.291 "dma_device_type": 2 00:28:29.291 } 00:28:29.291 ], 00:28:29.291 "driver_specific": { 00:28:29.291 "raid": { 00:28:29.291 "uuid": "8856e553-9fb4-47a8-bd46-747c95b604ef", 00:28:29.291 "strip_size_kb": 0, 00:28:29.291 "state": "online", 00:28:29.291 "raid_level": "raid1", 00:28:29.291 "superblock": true, 00:28:29.291 "num_base_bdevs": 2, 00:28:29.291 "num_base_bdevs_discovered": 2, 00:28:29.291 "num_base_bdevs_operational": 2, 00:28:29.291 "base_bdevs_list": [ 00:28:29.291 { 00:28:29.291 "name": "BaseBdev1", 00:28:29.291 "uuid": 
"daf2e969-aef3-4f49-be77-1efcf23a5c48", 00:28:29.291 "is_configured": true, 00:28:29.291 "data_offset": 256, 00:28:29.291 "data_size": 7936 00:28:29.291 }, 00:28:29.291 { 00:28:29.291 "name": "BaseBdev2", 00:28:29.291 "uuid": "b003afd1-53f9-4f6d-a66e-a64c79007219", 00:28:29.291 "is_configured": true, 00:28:29.291 "data_offset": 256, 00:28:29.291 "data_size": 7936 00:28:29.291 } 00:28:29.291 ] 00:28:29.291 } 00:28:29.291 } 00:28:29.291 }' 00:28:29.291 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:29.291 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:29.291 BaseBdev2' 00:28:29.291 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:29.291 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:29.291 18:32:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:29.547 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:29.547 "name": "BaseBdev1", 00:28:29.547 "aliases": [ 00:28:29.547 "daf2e969-aef3-4f49-be77-1efcf23a5c48" 00:28:29.547 ], 00:28:29.547 "product_name": "Malloc disk", 00:28:29.547 "block_size": 4128, 00:28:29.547 "num_blocks": 8192, 00:28:29.547 "uuid": "daf2e969-aef3-4f49-be77-1efcf23a5c48", 00:28:29.547 "md_size": 32, 00:28:29.547 "md_interleave": true, 00:28:29.547 "dif_type": 0, 00:28:29.547 "assigned_rate_limits": { 00:28:29.547 "rw_ios_per_sec": 0, 00:28:29.547 "rw_mbytes_per_sec": 0, 00:28:29.547 "r_mbytes_per_sec": 0, 00:28:29.547 "w_mbytes_per_sec": 0 00:28:29.547 }, 00:28:29.547 "claimed": 
true, 00:28:29.547 "claim_type": "exclusive_write", 00:28:29.547 "zoned": false, 00:28:29.547 "supported_io_types": { 00:28:29.547 "read": true, 00:28:29.547 "write": true, 00:28:29.547 "unmap": true, 00:28:29.547 "flush": true, 00:28:29.547 "reset": true, 00:28:29.547 "nvme_admin": false, 00:28:29.547 "nvme_io": false, 00:28:29.547 "nvme_io_md": false, 00:28:29.547 "write_zeroes": true, 00:28:29.547 "zcopy": true, 00:28:29.547 "get_zone_info": false, 00:28:29.547 "zone_management": false, 00:28:29.547 "zone_append": false, 00:28:29.547 "compare": false, 00:28:29.547 "compare_and_write": false, 00:28:29.547 "abort": true, 00:28:29.547 "seek_hole": false, 00:28:29.547 "seek_data": false, 00:28:29.547 "copy": true, 00:28:29.547 "nvme_iov_md": false 00:28:29.547 }, 00:28:29.547 "memory_domains": [ 00:28:29.547 { 00:28:29.547 "dma_device_id": "system", 00:28:29.547 "dma_device_type": 1 00:28:29.547 }, 00:28:29.547 { 00:28:29.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:29.547 "dma_device_type": 2 00:28:29.547 } 00:28:29.547 ], 00:28:29.547 "driver_specific": {} 00:28:29.547 }' 00:28:29.547 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:29.547 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:29.547 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:29.547 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:29.804 18:32:13 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:29.804 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:30.062 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:30.062 "name": "BaseBdev2", 00:28:30.062 "aliases": [ 00:28:30.062 "b003afd1-53f9-4f6d-a66e-a64c79007219" 00:28:30.062 ], 00:28:30.062 "product_name": "Malloc disk", 00:28:30.062 "block_size": 4128, 00:28:30.062 "num_blocks": 8192, 00:28:30.062 "uuid": "b003afd1-53f9-4f6d-a66e-a64c79007219", 00:28:30.062 "md_size": 32, 00:28:30.062 "md_interleave": true, 00:28:30.062 "dif_type": 0, 00:28:30.062 "assigned_rate_limits": { 00:28:30.062 "rw_ios_per_sec": 0, 00:28:30.062 "rw_mbytes_per_sec": 0, 00:28:30.062 "r_mbytes_per_sec": 0, 00:28:30.062 "w_mbytes_per_sec": 0 00:28:30.062 }, 00:28:30.062 "claimed": true, 00:28:30.062 "claim_type": "exclusive_write", 00:28:30.062 "zoned": false, 00:28:30.062 "supported_io_types": { 00:28:30.062 "read": true, 00:28:30.062 "write": true, 00:28:30.062 "unmap": true, 00:28:30.062 
"flush": true, 00:28:30.062 "reset": true, 00:28:30.062 "nvme_admin": false, 00:28:30.062 "nvme_io": false, 00:28:30.062 "nvme_io_md": false, 00:28:30.062 "write_zeroes": true, 00:28:30.062 "zcopy": true, 00:28:30.062 "get_zone_info": false, 00:28:30.062 "zone_management": false, 00:28:30.062 "zone_append": false, 00:28:30.062 "compare": false, 00:28:30.062 "compare_and_write": false, 00:28:30.062 "abort": true, 00:28:30.062 "seek_hole": false, 00:28:30.062 "seek_data": false, 00:28:30.062 "copy": true, 00:28:30.062 "nvme_iov_md": false 00:28:30.062 }, 00:28:30.062 "memory_domains": [ 00:28:30.062 { 00:28:30.062 "dma_device_id": "system", 00:28:30.062 "dma_device_type": 1 00:28:30.062 }, 00:28:30.062 { 00:28:30.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:30.062 "dma_device_type": 2 00:28:30.062 } 00:28:30.062 ], 00:28:30.062 "driver_specific": {} 00:28:30.062 }' 00:28:30.062 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:30.320 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:30.320 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:30.320 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:30.320 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:30.320 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:30.320 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:30.320 18:32:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:30.320 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:30.320 18:32:14 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:30.578 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:30.578 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:30.578 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:30.578 [2024-07-12 18:32:14.281010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:30.578 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:30.578 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:30.836 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:30.836 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:30.836 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:30.836 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:30.836 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:30.836 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:30.836 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:30.836 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:30.836 18:32:14 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:30.837 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:30.837 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:30.837 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:30.837 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:30.837 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.837 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:30.837 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:30.837 "name": "Existed_Raid", 00:28:30.837 "uuid": "8856e553-9fb4-47a8-bd46-747c95b604ef", 00:28:30.837 "strip_size_kb": 0, 00:28:30.837 "state": "online", 00:28:30.837 "raid_level": "raid1", 00:28:30.837 "superblock": true, 00:28:30.837 "num_base_bdevs": 2, 00:28:30.837 "num_base_bdevs_discovered": 1, 00:28:30.837 "num_base_bdevs_operational": 1, 00:28:30.837 "base_bdevs_list": [ 00:28:30.837 { 00:28:30.837 "name": null, 00:28:30.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:30.837 "is_configured": false, 00:28:30.837 "data_offset": 256, 00:28:30.837 "data_size": 7936 00:28:30.837 }, 00:28:30.837 { 00:28:30.837 "name": "BaseBdev2", 00:28:30.837 "uuid": "b003afd1-53f9-4f6d-a66e-a64c79007219", 00:28:30.837 "is_configured": true, 00:28:30.837 "data_offset": 256, 00:28:30.837 "data_size": 7936 00:28:30.837 } 00:28:30.837 ] 00:28:30.837 }' 00:28:30.837 
18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:30.837 18:32:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:31.404 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:31.404 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:31.404 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.404 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:31.970 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:31.970 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:31.970 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:32.228 [2024-07-12 18:32:15.794148] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:32.228 [2024-07-12 18:32:15.794227] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:32.228 [2024-07-12 18:32:15.805478] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:32.228 [2024-07-12 18:32:15.805517] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:32.228 [2024-07-12 18:32:15.805528] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2011180 name Existed_Raid, state offline 00:28:32.228 18:32:15 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:32.228 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:32.228 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.228 18:32:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 2614742 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2614742 ']' 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2614742 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2614742 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2614742' 00:28:32.487 killing process with pid 2614742 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2614742 00:28:32.487 [2024-07-12 18:32:16.111331] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:32.487 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2614742 00:28:32.487 [2024-07-12 18:32:16.112317] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:32.746 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:28:32.746 00:28:32.746 real 0m10.537s 00:28:32.746 user 0m18.705s 00:28:32.746 sys 0m1.992s 00:28:32.746 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:32.746 18:32:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:32.746 ************************************ 00:28:32.746 END TEST raid_state_function_test_sb_md_interleaved 00:28:32.746 ************************************ 00:28:32.746 18:32:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:32.746 18:32:16 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:28:32.746 18:32:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:32.746 18:32:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:32.746 18:32:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:32.746 ************************************ 00:28:32.746 START TEST raid_superblock_test_md_interleaved 00:28:32.746 ************************************ 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:32.746 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:32.747 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:32.747 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=2616250 00:28:32.747 18:32:16 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 2616250 /var/tmp/spdk-raid.sock 00:28:32.747 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2616250 ']' 00:28:32.747 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:32.747 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:32.747 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:32.747 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:32.747 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:32.747 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:32.747 18:32:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:32.747 [2024-07-12 18:32:16.463806] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:28:32.747 [2024-07-12 18:32:16.463873] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2616250 ] 00:28:33.006 [2024-07-12 18:32:16.595457] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:33.006 [2024-07-12 18:32:16.702387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:33.264 [2024-07-12 18:32:16.774521] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:33.264 [2024-07-12 18:32:16.774557] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:33.830 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:33.830 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:33.830 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:33.831 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:33.831 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:33.831 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:33.831 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:33.831 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:33.831 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:33.831 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:28:33.831 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:28:34.397 malloc1 00:28:34.397 18:32:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:34.655 [2024-07-12 18:32:18.130577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:34.655 [2024-07-12 18:32:18.130626] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:34.655 [2024-07-12 18:32:18.130647] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x145e4e0 00:28:34.655 [2024-07-12 18:32:18.130660] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:34.655 [2024-07-12 18:32:18.132216] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:34.655 [2024-07-12 18:32:18.132245] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:34.655 pt1 00:28:34.655 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:34.655 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:34.655 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:34.655 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:34.655 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:34.655 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:28:34.655 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:34.655 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:34.655 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:28:34.655 malloc2 00:28:34.914 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:35.172 [2024-07-12 18:32:18.870827] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:35.172 [2024-07-12 18:32:18.870877] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:35.172 [2024-07-12 18:32:18.870896] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1443570 00:28:35.172 [2024-07-12 18:32:18.870909] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:35.172 [2024-07-12 18:32:18.872411] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:35.172 [2024-07-12 18:32:18.872440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:35.172 pt2 00:28:35.430 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:35.430 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:35.430 18:32:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:28:35.430 [2024-07-12 18:32:19.115490] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:35.430 [2024-07-12 18:32:19.116944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:35.430 [2024-07-12 18:32:19.117096] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1444f20 00:28:35.430 [2024-07-12 18:32:19.117109] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:35.430 [2024-07-12 18:32:19.117175] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12c1050 00:28:35.430 [2024-07-12 18:32:19.117257] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1444f20 00:28:35.430 [2024-07-12 18:32:19.117267] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1444f20 00:28:35.430 [2024-07-12 18:32:19.117325] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:35.430 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:35.430 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:35.430 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:35.430 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:35.430 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:35.430 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:35.430 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:35.431 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.431 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.431 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.431 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.431 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:35.689 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:35.689 "name": "raid_bdev1", 00:28:35.689 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:35.689 "strip_size_kb": 0, 00:28:35.689 "state": "online", 00:28:35.689 "raid_level": "raid1", 00:28:35.689 "superblock": true, 00:28:35.689 "num_base_bdevs": 2, 00:28:35.689 "num_base_bdevs_discovered": 2, 00:28:35.689 "num_base_bdevs_operational": 2, 00:28:35.689 "base_bdevs_list": [ 00:28:35.689 { 00:28:35.689 "name": "pt1", 00:28:35.689 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:35.689 "is_configured": true, 00:28:35.689 "data_offset": 256, 00:28:35.689 "data_size": 7936 00:28:35.689 }, 00:28:35.689 { 00:28:35.689 "name": "pt2", 00:28:35.689 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:35.689 "is_configured": true, 00:28:35.689 "data_offset": 256, 00:28:35.689 "data_size": 7936 00:28:35.689 } 00:28:35.689 ] 00:28:35.689 }' 00:28:35.689 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:35.689 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:36.255 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:36.255 18:32:19 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:36.255 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:36.255 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:36.255 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:36.255 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:36.255 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:36.255 18:32:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:36.513 [2024-07-12 18:32:20.202624] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:36.513 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:36.513 "name": "raid_bdev1", 00:28:36.513 "aliases": [ 00:28:36.513 "6131c57d-e964-4b1d-ac66-4f4e088d50d9" 00:28:36.513 ], 00:28:36.514 "product_name": "Raid Volume", 00:28:36.514 "block_size": 4128, 00:28:36.514 "num_blocks": 7936, 00:28:36.514 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:36.514 "md_size": 32, 00:28:36.514 "md_interleave": true, 00:28:36.514 "dif_type": 0, 00:28:36.514 "assigned_rate_limits": { 00:28:36.514 "rw_ios_per_sec": 0, 00:28:36.514 "rw_mbytes_per_sec": 0, 00:28:36.514 "r_mbytes_per_sec": 0, 00:28:36.514 "w_mbytes_per_sec": 0 00:28:36.514 }, 00:28:36.514 "claimed": false, 00:28:36.514 "zoned": false, 00:28:36.514 "supported_io_types": { 00:28:36.514 "read": true, 00:28:36.514 "write": true, 00:28:36.514 "unmap": false, 00:28:36.514 "flush": false, 00:28:36.514 "reset": true, 00:28:36.514 "nvme_admin": false, 
00:28:36.514 "nvme_io": false, 00:28:36.514 "nvme_io_md": false, 00:28:36.514 "write_zeroes": true, 00:28:36.514 "zcopy": false, 00:28:36.514 "get_zone_info": false, 00:28:36.514 "zone_management": false, 00:28:36.514 "zone_append": false, 00:28:36.514 "compare": false, 00:28:36.514 "compare_and_write": false, 00:28:36.514 "abort": false, 00:28:36.514 "seek_hole": false, 00:28:36.514 "seek_data": false, 00:28:36.514 "copy": false, 00:28:36.514 "nvme_iov_md": false 00:28:36.514 }, 00:28:36.514 "memory_domains": [ 00:28:36.514 { 00:28:36.514 "dma_device_id": "system", 00:28:36.514 "dma_device_type": 1 00:28:36.514 }, 00:28:36.514 { 00:28:36.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:36.514 "dma_device_type": 2 00:28:36.514 }, 00:28:36.514 { 00:28:36.514 "dma_device_id": "system", 00:28:36.514 "dma_device_type": 1 00:28:36.514 }, 00:28:36.514 { 00:28:36.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:36.514 "dma_device_type": 2 00:28:36.514 } 00:28:36.514 ], 00:28:36.514 "driver_specific": { 00:28:36.514 "raid": { 00:28:36.514 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:36.514 "strip_size_kb": 0, 00:28:36.514 "state": "online", 00:28:36.514 "raid_level": "raid1", 00:28:36.514 "superblock": true, 00:28:36.514 "num_base_bdevs": 2, 00:28:36.514 "num_base_bdevs_discovered": 2, 00:28:36.514 "num_base_bdevs_operational": 2, 00:28:36.514 "base_bdevs_list": [ 00:28:36.514 { 00:28:36.514 "name": "pt1", 00:28:36.514 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:36.514 "is_configured": true, 00:28:36.514 "data_offset": 256, 00:28:36.514 "data_size": 7936 00:28:36.514 }, 00:28:36.514 { 00:28:36.514 "name": "pt2", 00:28:36.514 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:36.514 "is_configured": true, 00:28:36.514 "data_offset": 256, 00:28:36.514 "data_size": 7936 00:28:36.514 } 00:28:36.514 ] 00:28:36.514 } 00:28:36.514 } 00:28:36.514 }' 00:28:36.514 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:36.791 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:36.791 pt2' 00:28:36.791 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:36.791 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:36.791 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:37.059 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:37.059 "name": "pt1", 00:28:37.059 "aliases": [ 00:28:37.059 "00000000-0000-0000-0000-000000000001" 00:28:37.059 ], 00:28:37.059 "product_name": "passthru", 00:28:37.059 "block_size": 4128, 00:28:37.059 "num_blocks": 8192, 00:28:37.059 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:37.059 "md_size": 32, 00:28:37.059 "md_interleave": true, 00:28:37.059 "dif_type": 0, 00:28:37.059 "assigned_rate_limits": { 00:28:37.059 "rw_ios_per_sec": 0, 00:28:37.059 "rw_mbytes_per_sec": 0, 00:28:37.059 "r_mbytes_per_sec": 0, 00:28:37.059 "w_mbytes_per_sec": 0 00:28:37.059 }, 00:28:37.059 "claimed": true, 00:28:37.059 "claim_type": "exclusive_write", 00:28:37.059 "zoned": false, 00:28:37.059 "supported_io_types": { 00:28:37.059 "read": true, 00:28:37.059 "write": true, 00:28:37.059 "unmap": true, 00:28:37.059 "flush": true, 00:28:37.059 "reset": true, 00:28:37.059 "nvme_admin": false, 00:28:37.059 "nvme_io": false, 00:28:37.059 "nvme_io_md": false, 00:28:37.059 "write_zeroes": true, 00:28:37.059 "zcopy": true, 00:28:37.059 "get_zone_info": false, 00:28:37.059 "zone_management": false, 00:28:37.059 "zone_append": false, 00:28:37.059 "compare": false, 00:28:37.059 "compare_and_write": false, 00:28:37.059 
"abort": true, 00:28:37.059 "seek_hole": false, 00:28:37.059 "seek_data": false, 00:28:37.059 "copy": true, 00:28:37.059 "nvme_iov_md": false 00:28:37.059 }, 00:28:37.059 "memory_domains": [ 00:28:37.059 { 00:28:37.059 "dma_device_id": "system", 00:28:37.059 "dma_device_type": 1 00:28:37.059 }, 00:28:37.059 { 00:28:37.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:37.059 "dma_device_type": 2 00:28:37.059 } 00:28:37.059 ], 00:28:37.059 "driver_specific": { 00:28:37.059 "passthru": { 00:28:37.059 "name": "pt1", 00:28:37.059 "base_bdev_name": "malloc1" 00:28:37.059 } 00:28:37.059 } 00:28:37.059 }' 00:28:37.059 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:37.059 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:37.059 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:37.059 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:37.060 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:37.060 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:37.060 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:37.060 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:37.318 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:37.318 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:37.318 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:37.318 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:37.318 18:32:20 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:37.318 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:37.318 18:32:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:37.576 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:37.576 "name": "pt2", 00:28:37.576 "aliases": [ 00:28:37.576 "00000000-0000-0000-0000-000000000002" 00:28:37.576 ], 00:28:37.576 "product_name": "passthru", 00:28:37.576 "block_size": 4128, 00:28:37.576 "num_blocks": 8192, 00:28:37.576 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:37.576 "md_size": 32, 00:28:37.576 "md_interleave": true, 00:28:37.576 "dif_type": 0, 00:28:37.576 "assigned_rate_limits": { 00:28:37.576 "rw_ios_per_sec": 0, 00:28:37.576 "rw_mbytes_per_sec": 0, 00:28:37.576 "r_mbytes_per_sec": 0, 00:28:37.576 "w_mbytes_per_sec": 0 00:28:37.576 }, 00:28:37.576 "claimed": true, 00:28:37.576 "claim_type": "exclusive_write", 00:28:37.576 "zoned": false, 00:28:37.576 "supported_io_types": { 00:28:37.576 "read": true, 00:28:37.576 "write": true, 00:28:37.576 "unmap": true, 00:28:37.576 "flush": true, 00:28:37.576 "reset": true, 00:28:37.576 "nvme_admin": false, 00:28:37.576 "nvme_io": false, 00:28:37.576 "nvme_io_md": false, 00:28:37.576 "write_zeroes": true, 00:28:37.576 "zcopy": true, 00:28:37.576 "get_zone_info": false, 00:28:37.576 "zone_management": false, 00:28:37.576 "zone_append": false, 00:28:37.576 "compare": false, 00:28:37.576 "compare_and_write": false, 00:28:37.576 "abort": true, 00:28:37.576 "seek_hole": false, 00:28:37.576 "seek_data": false, 00:28:37.576 "copy": true, 00:28:37.576 "nvme_iov_md": false 00:28:37.576 }, 00:28:37.576 "memory_domains": [ 00:28:37.576 { 00:28:37.576 "dma_device_id": 
"system", 00:28:37.576 "dma_device_type": 1 00:28:37.576 }, 00:28:37.576 { 00:28:37.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:37.576 "dma_device_type": 2 00:28:37.576 } 00:28:37.576 ], 00:28:37.576 "driver_specific": { 00:28:37.576 "passthru": { 00:28:37.576 "name": "pt2", 00:28:37.576 "base_bdev_name": "malloc2" 00:28:37.576 } 00:28:37.576 } 00:28:37.576 }' 00:28:37.576 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:37.576 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:37.576 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:37.576 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:37.577 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:37.577 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:37.577 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:37.836 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:37.836 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:37.836 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:37.836 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:37.836 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:37.836 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:37.836 18:32:21 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:38.094 [2024-07-12 18:32:21.694589] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:38.094 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=6131c57d-e964-4b1d-ac66-4f4e088d50d9 00:28:38.094 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 6131c57d-e964-4b1d-ac66-4f4e088d50d9 ']' 00:28:38.094 18:32:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:38.660 [2024-07-12 18:32:22.195652] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:38.660 [2024-07-12 18:32:22.195676] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:38.660 [2024-07-12 18:32:22.195729] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:38.660 [2024-07-12 18:32:22.195781] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:38.660 [2024-07-12 18:32:22.195792] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1444f20 name raid_bdev1, state offline 00:28:38.660 18:32:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.660 18:32:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:38.919 18:32:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:38.919 18:32:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:38.919 18:32:22 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:38.919 18:32:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:39.177 18:32:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:39.177 18:32:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:39.435 18:32:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:39.435 18:32:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:39.693 18:32:23 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:39.693 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:39.951 [2024-07-12 18:32:23.426869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:39.951 [2024-07-12 18:32:23.428275] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:39.951 [2024-07-12 18:32:23.428329] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:39.951 [2024-07-12 18:32:23.428369] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:39.951 [2024-07-12 18:32:23.428387] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:39.951 [2024-07-12 18:32:23.428396] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x144f260 name raid_bdev1, state configuring 00:28:39.951 request: 00:28:39.952 { 00:28:39.952 "name": "raid_bdev1", 00:28:39.952 "raid_level": "raid1", 00:28:39.952 "base_bdevs": [ 00:28:39.952 "malloc1", 00:28:39.952 "malloc2" 00:28:39.952 ], 00:28:39.952 "superblock": false, 00:28:39.952 "method": "bdev_raid_create", 00:28:39.952 "req_id": 1 00:28:39.952 } 00:28:39.952 Got JSON-RPC error response 00:28:39.952 response: 00:28:39.952 { 00:28:39.952 "code": -17, 00:28:39.952 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:39.952 } 00:28:39.952 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:28:39.952 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:39.952 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:39.952 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:39.952 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:39.952 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:40.210 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:40.210 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:40.210 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:28:40.210 [2024-07-12 18:32:23.916126] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:40.210 [2024-07-12 18:32:23.916177] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:40.210 [2024-07-12 18:32:23.916195] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1446000 00:28:40.210 [2024-07-12 18:32:23.916208] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:40.210 [2024-07-12 18:32:23.917672] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:40.210 [2024-07-12 18:32:23.917701] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:40.210 [2024-07-12 18:32:23.917750] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:40.210 [2024-07-12 18:32:23.917777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:40.210 pt1 00:28:40.210 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:40.210 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:40.210 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:40.210 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:40.475 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:40.475 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:40.475 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:40.475 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:40.475 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:40.475 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:40.475 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.475 18:32:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.475 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:40.475 "name": "raid_bdev1", 00:28:40.475 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:40.475 "strip_size_kb": 0, 00:28:40.475 "state": "configuring", 00:28:40.475 "raid_level": "raid1", 00:28:40.475 "superblock": true, 00:28:40.475 "num_base_bdevs": 2, 00:28:40.476 "num_base_bdevs_discovered": 1, 00:28:40.476 "num_base_bdevs_operational": 2, 00:28:40.476 "base_bdevs_list": [ 00:28:40.476 { 00:28:40.476 "name": "pt1", 00:28:40.476 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:40.476 "is_configured": true, 00:28:40.476 "data_offset": 256, 00:28:40.476 "data_size": 7936 00:28:40.476 }, 00:28:40.476 { 00:28:40.476 "name": null, 00:28:40.476 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:40.476 "is_configured": false, 00:28:40.476 "data_offset": 256, 00:28:40.476 "data_size": 7936 00:28:40.476 } 00:28:40.476 ] 00:28:40.476 }' 00:28:40.476 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:40.476 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:41.042 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:41.042 18:32:24 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:41.042 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:41.042 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:41.300 [2024-07-12 18:32:24.946882] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:41.300 [2024-07-12 18:32:24.946943] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:41.300 [2024-07-12 18:32:24.946973] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1448270 00:28:41.300 [2024-07-12 18:32:24.946992] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:41.300 [2024-07-12 18:32:24.947180] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:41.300 [2024-07-12 18:32:24.947199] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:41.300 [2024-07-12 18:32:24.947245] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:41.300 [2024-07-12 18:32:24.947264] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:41.300 [2024-07-12 18:32:24.947343] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12c1c10 00:28:41.300 [2024-07-12 18:32:24.947353] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:41.300 [2024-07-12 18:32:24.947408] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1443d40 00:28:41.300 [2024-07-12 18:32:24.947481] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12c1c10 00:28:41.300 [2024-07-12 18:32:24.947490] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12c1c10 00:28:41.300 [2024-07-12 18:32:24.947544] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:41.300 pt2 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:41.300 18:32:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:41.300 18:32:24 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:41.558 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:41.558 "name": "raid_bdev1", 00:28:41.558 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:41.558 "strip_size_kb": 0, 00:28:41.558 "state": "online", 00:28:41.558 "raid_level": "raid1", 00:28:41.558 "superblock": true, 00:28:41.558 "num_base_bdevs": 2, 00:28:41.558 "num_base_bdevs_discovered": 2, 00:28:41.558 "num_base_bdevs_operational": 2, 00:28:41.558 "base_bdevs_list": [ 00:28:41.558 { 00:28:41.558 "name": "pt1", 00:28:41.558 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:41.558 "is_configured": true, 00:28:41.558 "data_offset": 256, 00:28:41.558 "data_size": 7936 00:28:41.558 }, 00:28:41.558 { 00:28:41.558 "name": "pt2", 00:28:41.558 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:41.558 "is_configured": true, 00:28:41.558 "data_offset": 256, 00:28:41.558 "data_size": 7936 00:28:41.558 } 00:28:41.558 ] 00:28:41.558 }' 00:28:41.558 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:41.558 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:42.125 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:42.125 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:42.125 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:42.125 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:42.125 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:42.125 18:32:25 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:42.125 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:42.125 18:32:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:42.384 [2024-07-12 18:32:26.009971] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:42.384 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:42.384 "name": "raid_bdev1", 00:28:42.384 "aliases": [ 00:28:42.384 "6131c57d-e964-4b1d-ac66-4f4e088d50d9" 00:28:42.384 ], 00:28:42.384 "product_name": "Raid Volume", 00:28:42.384 "block_size": 4128, 00:28:42.384 "num_blocks": 7936, 00:28:42.384 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:42.384 "md_size": 32, 00:28:42.384 "md_interleave": true, 00:28:42.384 "dif_type": 0, 00:28:42.384 "assigned_rate_limits": { 00:28:42.384 "rw_ios_per_sec": 0, 00:28:42.384 "rw_mbytes_per_sec": 0, 00:28:42.384 "r_mbytes_per_sec": 0, 00:28:42.384 "w_mbytes_per_sec": 0 00:28:42.384 }, 00:28:42.384 "claimed": false, 00:28:42.384 "zoned": false, 00:28:42.384 "supported_io_types": { 00:28:42.384 "read": true, 00:28:42.384 "write": true, 00:28:42.384 "unmap": false, 00:28:42.384 "flush": false, 00:28:42.384 "reset": true, 00:28:42.384 "nvme_admin": false, 00:28:42.384 "nvme_io": false, 00:28:42.384 "nvme_io_md": false, 00:28:42.384 "write_zeroes": true, 00:28:42.384 "zcopy": false, 00:28:42.384 "get_zone_info": false, 00:28:42.384 "zone_management": false, 00:28:42.384 "zone_append": false, 00:28:42.384 "compare": false, 00:28:42.384 "compare_and_write": false, 00:28:42.384 "abort": false, 00:28:42.384 "seek_hole": false, 00:28:42.384 "seek_data": false, 00:28:42.384 "copy": false, 00:28:42.384 "nvme_iov_md": false 00:28:42.384 }, 
00:28:42.384 "memory_domains": [ 00:28:42.384 { 00:28:42.384 "dma_device_id": "system", 00:28:42.384 "dma_device_type": 1 00:28:42.384 }, 00:28:42.384 { 00:28:42.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:42.384 "dma_device_type": 2 00:28:42.384 }, 00:28:42.384 { 00:28:42.384 "dma_device_id": "system", 00:28:42.384 "dma_device_type": 1 00:28:42.384 }, 00:28:42.384 { 00:28:42.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:42.384 "dma_device_type": 2 00:28:42.384 } 00:28:42.384 ], 00:28:42.384 "driver_specific": { 00:28:42.384 "raid": { 00:28:42.384 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:42.384 "strip_size_kb": 0, 00:28:42.384 "state": "online", 00:28:42.384 "raid_level": "raid1", 00:28:42.384 "superblock": true, 00:28:42.384 "num_base_bdevs": 2, 00:28:42.384 "num_base_bdevs_discovered": 2, 00:28:42.384 "num_base_bdevs_operational": 2, 00:28:42.384 "base_bdevs_list": [ 00:28:42.384 { 00:28:42.384 "name": "pt1", 00:28:42.384 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:42.384 "is_configured": true, 00:28:42.384 "data_offset": 256, 00:28:42.384 "data_size": 7936 00:28:42.384 }, 00:28:42.384 { 00:28:42.384 "name": "pt2", 00:28:42.384 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:42.384 "is_configured": true, 00:28:42.384 "data_offset": 256, 00:28:42.384 "data_size": 7936 00:28:42.384 } 00:28:42.384 ] 00:28:42.384 } 00:28:42.384 } 00:28:42.384 }' 00:28:42.384 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:42.384 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:42.384 pt2' 00:28:42.384 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:42.384 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:42.384 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:42.643 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:42.643 "name": "pt1", 00:28:42.643 "aliases": [ 00:28:42.643 "00000000-0000-0000-0000-000000000001" 00:28:42.643 ], 00:28:42.643 "product_name": "passthru", 00:28:42.643 "block_size": 4128, 00:28:42.643 "num_blocks": 8192, 00:28:42.643 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:42.643 "md_size": 32, 00:28:42.643 "md_interleave": true, 00:28:42.643 "dif_type": 0, 00:28:42.643 "assigned_rate_limits": { 00:28:42.643 "rw_ios_per_sec": 0, 00:28:42.643 "rw_mbytes_per_sec": 0, 00:28:42.643 "r_mbytes_per_sec": 0, 00:28:42.643 "w_mbytes_per_sec": 0 00:28:42.643 }, 00:28:42.643 "claimed": true, 00:28:42.643 "claim_type": "exclusive_write", 00:28:42.643 "zoned": false, 00:28:42.643 "supported_io_types": { 00:28:42.643 "read": true, 00:28:42.643 "write": true, 00:28:42.643 "unmap": true, 00:28:42.643 "flush": true, 00:28:42.643 "reset": true, 00:28:42.643 "nvme_admin": false, 00:28:42.643 "nvme_io": false, 00:28:42.643 "nvme_io_md": false, 00:28:42.643 "write_zeroes": true, 00:28:42.643 "zcopy": true, 00:28:42.643 "get_zone_info": false, 00:28:42.643 "zone_management": false, 00:28:42.643 "zone_append": false, 00:28:42.643 "compare": false, 00:28:42.643 "compare_and_write": false, 00:28:42.643 "abort": true, 00:28:42.643 "seek_hole": false, 00:28:42.643 "seek_data": false, 00:28:42.643 "copy": true, 00:28:42.643 "nvme_iov_md": false 00:28:42.643 }, 00:28:42.643 "memory_domains": [ 00:28:42.643 { 00:28:42.643 "dma_device_id": "system", 00:28:42.643 "dma_device_type": 1 00:28:42.643 }, 00:28:42.643 { 00:28:42.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:42.643 "dma_device_type": 2 00:28:42.643 } 00:28:42.643 ], 00:28:42.643 
"driver_specific": { 00:28:42.643 "passthru": { 00:28:42.643 "name": "pt1", 00:28:42.643 "base_bdev_name": "malloc1" 00:28:42.643 } 00:28:42.643 } 00:28:42.643 }' 00:28:42.643 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:42.643 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:42.643 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:42.643 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:42.901 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:43.160 18:32:26 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:43.160 "name": "pt2", 00:28:43.160 "aliases": [ 00:28:43.160 "00000000-0000-0000-0000-000000000002" 00:28:43.160 ], 00:28:43.160 "product_name": "passthru", 00:28:43.160 "block_size": 4128, 00:28:43.160 "num_blocks": 8192, 00:28:43.160 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:43.160 "md_size": 32, 00:28:43.160 "md_interleave": true, 00:28:43.160 "dif_type": 0, 00:28:43.160 "assigned_rate_limits": { 00:28:43.160 "rw_ios_per_sec": 0, 00:28:43.160 "rw_mbytes_per_sec": 0, 00:28:43.160 "r_mbytes_per_sec": 0, 00:28:43.160 "w_mbytes_per_sec": 0 00:28:43.160 }, 00:28:43.160 "claimed": true, 00:28:43.160 "claim_type": "exclusive_write", 00:28:43.160 "zoned": false, 00:28:43.160 "supported_io_types": { 00:28:43.160 "read": true, 00:28:43.160 "write": true, 00:28:43.160 "unmap": true, 00:28:43.160 "flush": true, 00:28:43.160 "reset": true, 00:28:43.160 "nvme_admin": false, 00:28:43.160 "nvme_io": false, 00:28:43.160 "nvme_io_md": false, 00:28:43.160 "write_zeroes": true, 00:28:43.160 "zcopy": true, 00:28:43.160 "get_zone_info": false, 00:28:43.160 "zone_management": false, 00:28:43.160 "zone_append": false, 00:28:43.160 "compare": false, 00:28:43.160 "compare_and_write": false, 00:28:43.160 "abort": true, 00:28:43.160 "seek_hole": false, 00:28:43.160 "seek_data": false, 00:28:43.160 "copy": true, 00:28:43.160 "nvme_iov_md": false 00:28:43.160 }, 00:28:43.160 "memory_domains": [ 00:28:43.160 { 00:28:43.160 "dma_device_id": "system", 00:28:43.160 "dma_device_type": 1 00:28:43.160 }, 00:28:43.160 { 00:28:43.160 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:43.160 "dma_device_type": 2 00:28:43.160 } 00:28:43.160 ], 00:28:43.160 "driver_specific": { 00:28:43.160 "passthru": { 00:28:43.160 "name": "pt2", 00:28:43.160 "base_bdev_name": "malloc2" 00:28:43.160 } 00:28:43.160 } 00:28:43.160 }' 00:28:43.160 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:43.419 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:43.419 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:43.419 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:43.419 18:32:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:43.419 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:43.419 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:43.419 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:43.419 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:43.419 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:43.677 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:43.677 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:43.677 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:43.677 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:43.935 [2024-07-12 18:32:27.441751] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:43.935 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 6131c57d-e964-4b1d-ac66-4f4e088d50d9 '!=' 6131c57d-e964-4b1d-ac66-4f4e088d50d9 ']' 00:28:43.935 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:43.935 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:43.935 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:43.935 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:44.194 [2024-07-12 18:32:27.690204] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.194 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:44.452 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:44.452 "name": "raid_bdev1", 00:28:44.452 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:44.452 "strip_size_kb": 0, 00:28:44.452 "state": "online", 00:28:44.452 "raid_level": "raid1", 00:28:44.452 "superblock": true, 00:28:44.452 "num_base_bdevs": 2, 00:28:44.452 "num_base_bdevs_discovered": 1, 00:28:44.452 "num_base_bdevs_operational": 1, 00:28:44.452 "base_bdevs_list": [ 00:28:44.452 { 00:28:44.452 "name": null, 00:28:44.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:44.452 "is_configured": false, 00:28:44.452 "data_offset": 256, 00:28:44.452 "data_size": 7936 00:28:44.452 }, 00:28:44.452 { 00:28:44.452 "name": "pt2", 00:28:44.453 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:44.453 "is_configured": true, 00:28:44.453 "data_offset": 256, 00:28:44.453 "data_size": 7936 00:28:44.453 } 00:28:44.453 ] 00:28:44.453 }' 00:28:44.453 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:44.453 18:32:27 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:45.019 18:32:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:45.277 [2024-07-12 18:32:28.773063] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:45.277 [2024-07-12 18:32:28.773090] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:45.277 [2024-07-12 18:32:28.773141] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:45.277 [2024-07-12 
18:32:28.773181] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:45.277 [2024-07-12 18:32:28.773193] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c1c10 name raid_bdev1, state offline 00:28:45.277 18:32:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:45.277 18:32:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.536 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:45.536 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:45.536 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:45.536 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:45.536 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:45.795 [2024-07-12 18:32:29.450816] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:45.795 [2024-07-12 18:32:29.450858] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:45.795 [2024-07-12 18:32:29.450882] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14469f0 00:28:45.795 [2024-07-12 18:32:29.450895] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:45.795 [2024-07-12 18:32:29.452337] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:45.795 [2024-07-12 18:32:29.452367] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:45.795 [2024-07-12 18:32:29.452409] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:45.795 [2024-07-12 18:32:29.452434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:45.795 [2024-07-12 18:32:29.452497] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1447ea0 00:28:45.795 [2024-07-12 18:32:29.452507] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:45.795 [2024-07-12 18:32:29.452561] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1445bc0 00:28:45.795 [2024-07-12 18:32:29.452631] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1447ea0 00:28:45.795 [2024-07-12 18:32:29.452641] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1447ea0 00:28:45.795 [2024-07-12 18:32:29.452693] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:45.795 pt2 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 
-- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.795 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:46.054 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:46.054 "name": "raid_bdev1", 00:28:46.054 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:46.054 "strip_size_kb": 0, 00:28:46.054 "state": "online", 00:28:46.054 "raid_level": "raid1", 00:28:46.054 "superblock": true, 00:28:46.054 "num_base_bdevs": 2, 00:28:46.054 "num_base_bdevs_discovered": 1, 00:28:46.054 "num_base_bdevs_operational": 1, 00:28:46.054 
"base_bdevs_list": [ 00:28:46.054 { 00:28:46.054 "name": null, 00:28:46.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:46.054 "is_configured": false, 00:28:46.054 "data_offset": 256, 00:28:46.054 "data_size": 7936 00:28:46.054 }, 00:28:46.054 { 00:28:46.054 "name": "pt2", 00:28:46.054 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:46.054 "is_configured": true, 00:28:46.054 "data_offset": 256, 00:28:46.054 "data_size": 7936 00:28:46.054 } 00:28:46.054 ] 00:28:46.054 }' 00:28:46.054 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:46.054 18:32:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:46.620 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:46.879 [2024-07-12 18:32:30.481549] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:46.879 [2024-07-12 18:32:30.481576] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:46.879 [2024-07-12 18:32:30.481635] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:46.879 [2024-07-12 18:32:30.481676] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:46.879 [2024-07-12 18:32:30.481687] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1447ea0 name raid_bdev1, state offline 00:28:46.879 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.879 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:47.137 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:47.137 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:47.137 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:47.137 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:47.396 [2024-07-12 18:32:30.962797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:47.396 [2024-07-12 18:32:30.962842] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:47.396 [2024-07-12 18:32:30.962860] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1446620 00:28:47.396 [2024-07-12 18:32:30.962872] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:47.396 [2024-07-12 18:32:30.964292] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:47.396 [2024-07-12 18:32:30.964321] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:47.396 [2024-07-12 18:32:30.964364] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:47.396 [2024-07-12 18:32:30.964388] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:47.396 [2024-07-12 18:32:30.964467] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:47.396 [2024-07-12 18:32:30.964480] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:47.396 [2024-07-12 18:32:30.964494] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1448640 name raid_bdev1, state configuring 00:28:47.396 [2024-07-12 18:32:30.964516] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:47.396 [2024-07-12 18:32:30.964566] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1448640 00:28:47.396 [2024-07-12 18:32:30.964576] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:47.396 [2024-07-12 18:32:30.964630] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1447810 00:28:47.396 [2024-07-12 18:32:30.964699] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1448640 00:28:47.396 [2024-07-12 18:32:30.964708] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1448640 00:28:47.396 [2024-07-12 18:32:30.964765] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:47.396 pt1 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:47.396 18:32:30 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.396 18:32:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.654 18:32:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:47.654 "name": "raid_bdev1", 00:28:47.654 "uuid": "6131c57d-e964-4b1d-ac66-4f4e088d50d9", 00:28:47.654 "strip_size_kb": 0, 00:28:47.654 "state": "online", 00:28:47.654 "raid_level": "raid1", 00:28:47.654 "superblock": true, 00:28:47.654 "num_base_bdevs": 2, 00:28:47.654 "num_base_bdevs_discovered": 1, 00:28:47.654 "num_base_bdevs_operational": 1, 00:28:47.654 "base_bdevs_list": [ 00:28:47.654 { 00:28:47.654 "name": null, 00:28:47.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:47.654 "is_configured": false, 00:28:47.654 "data_offset": 256, 00:28:47.654 "data_size": 7936 00:28:47.654 }, 00:28:47.654 { 00:28:47.654 "name": "pt2", 00:28:47.654 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:47.654 "is_configured": true, 00:28:47.654 "data_offset": 256, 00:28:47.654 "data_size": 7936 00:28:47.654 } 00:28:47.654 ] 00:28:47.654 }' 00:28:47.654 18:32:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:47.654 18:32:31 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:48.219 18:32:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:48.220 
18:32:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:48.477 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:48.477 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:48.477 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:48.736 [2024-07-12 18:32:32.286554] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 6131c57d-e964-4b1d-ac66-4f4e088d50d9 '!=' 6131c57d-e964-4b1d-ac66-4f4e088d50d9 ']' 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 2616250 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2616250 ']' 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2616250 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2616250 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 
-- # echo 'killing process with pid 2616250' 00:28:48.736 killing process with pid 2616250 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 2616250 00:28:48.736 [2024-07-12 18:32:32.351362] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:48.736 [2024-07-12 18:32:32.351411] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:48.736 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 2616250 00:28:48.736 [2024-07-12 18:32:32.351458] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:48.736 [2024-07-12 18:32:32.351470] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1448640 name raid_bdev1, state offline 00:28:48.736 [2024-07-12 18:32:32.371070] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:48.994 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:28:48.994 00:28:48.994 real 0m16.192s 00:28:48.994 user 0m29.266s 00:28:48.994 sys 0m3.007s 00:28:48.994 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:48.994 18:32:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:48.994 ************************************ 00:28:48.994 END TEST raid_superblock_test_md_interleaved 00:28:48.994 ************************************ 00:28:48.994 18:32:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:48.994 18:32:32 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:28:48.994 18:32:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:48.994 18:32:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:48.994 18:32:32 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:28:48.994 ************************************ 00:28:48.994 START TEST raid_rebuild_test_sb_md_interleaved 00:28:48.994 ************************************ 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 
'BaseBdev2') 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=2618623 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 2618623 /var/tmp/spdk-raid.sock 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2618623 ']' 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and 
listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:48.994 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:48.994 18:32:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:49.269 [2024-07-12 18:32:32.735599] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:28:49.269 [2024-07-12 18:32:32.735666] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2618623 ] 00:28:49.269 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:49.269 Zero copy mechanism will not be used. 
00:28:49.269 [2024-07-12 18:32:32.868188] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:49.269 [2024-07-12 18:32:32.974961] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:49.526 [2024-07-12 18:32:33.042369] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:49.526 [2024-07-12 18:32:33.042406] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:50.091 18:32:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:50.091 18:32:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:50.091 18:32:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:50.091 18:32:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:28:50.091 BaseBdev1_malloc 00:28:50.091 18:32:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:50.348 [2024-07-12 18:32:33.963554] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:50.348 [2024-07-12 18:32:33.963602] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:50.348 [2024-07-12 18:32:33.963624] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2391ce0 00:28:50.348 [2024-07-12 18:32:33.963637] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:50.348 [2024-07-12 18:32:33.965184] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:50.348 [2024-07-12 18:32:33.965213] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:50.348 BaseBdev1 00:28:50.348 18:32:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:50.348 18:32:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:28:50.606 BaseBdev2_malloc 00:28:50.606 18:32:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:50.865 [2024-07-12 18:32:34.370113] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:50.865 [2024-07-12 18:32:34.370159] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:50.865 [2024-07-12 18:32:34.370181] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23892d0 00:28:50.865 [2024-07-12 18:32:34.370194] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:50.865 [2024-07-12 18:32:34.371962] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:50.865 [2024-07-12 18:32:34.371993] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:50.865 BaseBdev2 00:28:50.865 18:32:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:28:50.865 spare_malloc 00:28:50.865 18:32:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:28:51.123 spare_delay 00:28:51.123 18:32:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:51.382 [2024-07-12 18:32:34.933804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:51.382 [2024-07-12 18:32:34.933855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:51.382 [2024-07-12 18:32:34.933876] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238c070 00:28:51.382 [2024-07-12 18:32:34.933890] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:51.382 [2024-07-12 18:32:34.935304] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:51.382 [2024-07-12 18:32:34.935332] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:51.382 spare 00:28:51.382 18:32:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:51.641 [2024-07-12 18:32:35.162442] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:51.641 [2024-07-12 18:32:35.163779] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:51.641 [2024-07-12 18:32:35.163950] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x238e370 00:28:51.641 [2024-07-12 18:32:35.163964] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:51.641 [2024-07-12 18:32:35.164036] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21f49c0 00:28:51.641 [2024-07-12 18:32:35.164117] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x238e370 00:28:51.642 [2024-07-12 18:32:35.164127] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x238e370 00:28:51.642 [2024-07-12 18:32:35.164183] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.642 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.900 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:28:51.900 "name": "raid_bdev1", 00:28:51.900 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:28:51.900 "strip_size_kb": 0, 00:28:51.900 "state": "online", 00:28:51.900 "raid_level": "raid1", 00:28:51.900 "superblock": true, 00:28:51.900 "num_base_bdevs": 2, 00:28:51.900 "num_base_bdevs_discovered": 2, 00:28:51.900 "num_base_bdevs_operational": 2, 00:28:51.900 "base_bdevs_list": [ 00:28:51.900 { 00:28:51.900 "name": "BaseBdev1", 00:28:51.900 "uuid": "7b94f8a7-9743-549b-9a3a-96e0515859c1", 00:28:51.901 "is_configured": true, 00:28:51.901 "data_offset": 256, 00:28:51.901 "data_size": 7936 00:28:51.901 }, 00:28:51.901 { 00:28:51.901 "name": "BaseBdev2", 00:28:51.901 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:28:51.901 "is_configured": true, 00:28:51.901 "data_offset": 256, 00:28:51.901 "data_size": 7936 00:28:51.901 } 00:28:51.901 ] 00:28:51.901 }' 00:28:51.901 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:51.901 18:32:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:52.467 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:52.467 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:52.726 [2024-07-12 18:32:36.225474] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:52.726 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:52.726 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.726 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:28:52.984 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:52.984 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:52.984 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:28:52.984 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:53.283 [2024-07-12 18:32:36.722538] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:53.283 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:53.284 
18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:53.284 18:32:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:53.563 18:32:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:53.563 "name": "raid_bdev1", 00:28:53.563 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:28:53.563 "strip_size_kb": 0, 00:28:53.563 "state": "online", 00:28:53.563 "raid_level": "raid1", 00:28:53.563 "superblock": true, 00:28:53.563 "num_base_bdevs": 2, 00:28:53.563 "num_base_bdevs_discovered": 1, 00:28:53.563 "num_base_bdevs_operational": 1, 00:28:53.563 "base_bdevs_list": [ 00:28:53.563 { 00:28:53.563 "name": null, 00:28:53.563 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:53.563 "is_configured": false, 00:28:53.563 "data_offset": 256, 00:28:53.563 "data_size": 7936 00:28:53.563 }, 00:28:53.563 { 00:28:53.563 "name": "BaseBdev2", 00:28:53.563 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:28:53.563 "is_configured": true, 00:28:53.563 "data_offset": 256, 00:28:53.563 "data_size": 7936 00:28:53.563 } 00:28:53.563 ] 00:28:53.563 }' 00:28:53.563 18:32:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:53.563 18:32:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:54.498 18:32:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:54.498 [2024-07-12 18:32:38.082143] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:54.498 [2024-07-12 18:32:38.085727] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x238e250 00:28:54.498 [2024-07-12 18:32:38.087721] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:54.498 18:32:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:55.432 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:55.432 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:55.432 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:55.432 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:55.432 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:55.432 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.432 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:55.998 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:55.998 "name": "raid_bdev1", 00:28:55.998 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:28:55.998 "strip_size_kb": 0, 00:28:55.998 "state": "online", 00:28:55.998 "raid_level": "raid1", 00:28:55.998 "superblock": true, 00:28:55.998 "num_base_bdevs": 2, 00:28:55.998 "num_base_bdevs_discovered": 2, 00:28:55.998 "num_base_bdevs_operational": 2, 00:28:55.998 "process": { 00:28:55.998 "type": "rebuild", 00:28:55.998 "target": "spare", 00:28:55.998 "progress": { 00:28:55.998 "blocks": 3584, 00:28:55.998 "percent": 45 00:28:55.998 } 00:28:55.998 }, 00:28:55.998 "base_bdevs_list": [ 00:28:55.998 { 
00:28:55.998 "name": "spare", 00:28:55.998 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:28:55.998 "is_configured": true, 00:28:55.998 "data_offset": 256, 00:28:55.998 "data_size": 7936 00:28:55.998 }, 00:28:55.998 { 00:28:55.998 "name": "BaseBdev2", 00:28:55.998 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:28:55.998 "is_configured": true, 00:28:55.998 "data_offset": 256, 00:28:55.998 "data_size": 7936 00:28:55.998 } 00:28:55.998 ] 00:28:55.998 }' 00:28:55.998 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:55.998 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:55.998 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:55.998 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:55.998 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:56.256 [2024-07-12 18:32:39.897465] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:56.256 [2024-07-12 18:32:39.901960] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:56.256 [2024-07-12 18:32:39.902003] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:56.256 [2024-07-12 18:32:39.902018] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:56.256 [2024-07-12 18:32:39.902027] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:56.256 18:32:39 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.256 18:32:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:56.514 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:56.514 "name": "raid_bdev1", 00:28:56.514 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:28:56.514 "strip_size_kb": 0, 00:28:56.514 "state": "online", 00:28:56.514 "raid_level": "raid1", 00:28:56.514 "superblock": true, 00:28:56.514 "num_base_bdevs": 2, 00:28:56.514 "num_base_bdevs_discovered": 1, 00:28:56.514 "num_base_bdevs_operational": 1, 00:28:56.514 "base_bdevs_list": [ 00:28:56.514 { 00:28:56.514 "name": null, 00:28:56.514 
"uuid": "00000000-0000-0000-0000-000000000000", 00:28:56.514 "is_configured": false, 00:28:56.514 "data_offset": 256, 00:28:56.514 "data_size": 7936 00:28:56.514 }, 00:28:56.514 { 00:28:56.514 "name": "BaseBdev2", 00:28:56.514 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:28:56.514 "is_configured": true, 00:28:56.514 "data_offset": 256, 00:28:56.514 "data_size": 7936 00:28:56.514 } 00:28:56.514 ] 00:28:56.514 }' 00:28:56.514 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:56.514 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:57.084 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:57.084 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:57.084 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:57.084 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:57.084 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:57.084 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.084 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:57.342 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:57.343 "name": "raid_bdev1", 00:28:57.343 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:28:57.343 "strip_size_kb": 0, 00:28:57.343 "state": "online", 00:28:57.343 "raid_level": "raid1", 00:28:57.343 "superblock": true, 00:28:57.343 
"num_base_bdevs": 2, 00:28:57.343 "num_base_bdevs_discovered": 1, 00:28:57.343 "num_base_bdevs_operational": 1, 00:28:57.343 "base_bdevs_list": [ 00:28:57.343 { 00:28:57.343 "name": null, 00:28:57.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:57.343 "is_configured": false, 00:28:57.343 "data_offset": 256, 00:28:57.343 "data_size": 7936 00:28:57.343 }, 00:28:57.343 { 00:28:57.343 "name": "BaseBdev2", 00:28:57.343 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:28:57.343 "is_configured": true, 00:28:57.343 "data_offset": 256, 00:28:57.343 "data_size": 7936 00:28:57.343 } 00:28:57.343 ] 00:28:57.343 }' 00:28:57.343 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:57.343 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:57.343 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:57.343 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:57.343 18:32:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:57.602 [2024-07-12 18:32:41.225301] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:57.602 [2024-07-12 18:32:41.229421] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x238a270 00:28:57.602 [2024-07-12 18:32:41.230900] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:57.602 18:32:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:58.537 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:28:58.537 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:58.537 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:58.537 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:58.537 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:58.537 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:58.537 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:58.797 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:58.797 "name": "raid_bdev1", 00:28:58.797 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:28:58.797 "strip_size_kb": 0, 00:28:58.797 "state": "online", 00:28:58.797 "raid_level": "raid1", 00:28:58.797 "superblock": true, 00:28:58.797 "num_base_bdevs": 2, 00:28:58.797 "num_base_bdevs_discovered": 2, 00:28:58.797 "num_base_bdevs_operational": 2, 00:28:58.797 "process": { 00:28:58.797 "type": "rebuild", 00:28:58.797 "target": "spare", 00:28:58.797 "progress": { 00:28:58.797 "blocks": 3072, 00:28:58.797 "percent": 38 00:28:58.797 } 00:28:58.797 }, 00:28:58.797 "base_bdevs_list": [ 00:28:58.797 { 00:28:58.797 "name": "spare", 00:28:58.797 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:28:58.797 "is_configured": true, 00:28:58.797 "data_offset": 256, 00:28:58.797 "data_size": 7936 00:28:58.797 }, 00:28:58.797 { 00:28:58.797 "name": "BaseBdev2", 00:28:58.797 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:28:58.797 "is_configured": true, 00:28:58.797 "data_offset": 256, 00:28:58.797 "data_size": 7936 00:28:58.797 
} 00:28:58.797 ] 00:28:58.797 }' 00:28:58.797 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:59.056 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1135 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:59.056 "name": "raid_bdev1", 00:28:59.056 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:28:59.056 "strip_size_kb": 0, 00:28:59.056 "state": "online", 00:28:59.056 "raid_level": "raid1", 00:28:59.056 "superblock": true, 00:28:59.056 "num_base_bdevs": 2, 00:28:59.056 "num_base_bdevs_discovered": 2, 00:28:59.056 "num_base_bdevs_operational": 2, 00:28:59.056 "process": { 00:28:59.056 "type": "rebuild", 00:28:59.056 "target": "spare", 00:28:59.056 "progress": { 00:28:59.056 "blocks": 3584, 00:28:59.056 "percent": 45 00:28:59.056 } 00:28:59.056 }, 00:28:59.056 "base_bdevs_list": [ 00:28:59.056 { 00:28:59.056 "name": "spare", 00:28:59.056 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:28:59.056 "is_configured": true, 00:28:59.056 "data_offset": 256, 00:28:59.056 "data_size": 7936 00:28:59.056 }, 00:28:59.056 { 00:28:59.056 "name": "BaseBdev2", 00:28:59.056 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:28:59.056 "is_configured": true, 00:28:59.056 "data_offset": 256, 00:28:59.056 "data_size": 7936 00:28:59.056 } 00:28:59.056 ] 00:28:59.056 }' 00:28:59.056 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:59.314 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:59.314 18:32:42 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:59.314 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:59.314 18:32:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:00.247 18:32:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:00.247 18:32:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:00.247 18:32:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:00.247 18:32:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:00.247 18:32:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:00.247 18:32:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:00.247 18:32:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.247 18:32:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:00.506 18:32:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:00.506 "name": "raid_bdev1", 00:29:00.506 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:00.506 "strip_size_kb": 0, 00:29:00.506 "state": "online", 00:29:00.506 "raid_level": "raid1", 00:29:00.506 "superblock": true, 00:29:00.506 "num_base_bdevs": 2, 00:29:00.506 "num_base_bdevs_discovered": 2, 00:29:00.506 "num_base_bdevs_operational": 2, 00:29:00.506 "process": { 00:29:00.506 "type": "rebuild", 00:29:00.506 
"target": "spare", 00:29:00.506 "progress": { 00:29:00.506 "blocks": 7168, 00:29:00.506 "percent": 90 00:29:00.506 } 00:29:00.506 }, 00:29:00.506 "base_bdevs_list": [ 00:29:00.506 { 00:29:00.506 "name": "spare", 00:29:00.506 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:29:00.506 "is_configured": true, 00:29:00.506 "data_offset": 256, 00:29:00.506 "data_size": 7936 00:29:00.506 }, 00:29:00.506 { 00:29:00.506 "name": "BaseBdev2", 00:29:00.506 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:00.506 "is_configured": true, 00:29:00.506 "data_offset": 256, 00:29:00.506 "data_size": 7936 00:29:00.506 } 00:29:00.506 ] 00:29:00.506 }' 00:29:00.506 18:32:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:00.506 18:32:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:00.506 18:32:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:00.506 18:32:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:00.506 18:32:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:00.765 [2024-07-12 18:32:44.355132] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:00.765 [2024-07-12 18:32:44.355187] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:00.765 [2024-07-12 18:32:44.355269] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:01.700 "name": "raid_bdev1", 00:29:01.700 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:01.700 "strip_size_kb": 0, 00:29:01.700 "state": "online", 00:29:01.700 "raid_level": "raid1", 00:29:01.700 "superblock": true, 00:29:01.700 "num_base_bdevs": 2, 00:29:01.700 "num_base_bdevs_discovered": 2, 00:29:01.700 "num_base_bdevs_operational": 2, 00:29:01.700 "base_bdevs_list": [ 00:29:01.700 { 00:29:01.700 "name": "spare", 00:29:01.700 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:29:01.700 "is_configured": true, 00:29:01.700 "data_offset": 256, 00:29:01.700 "data_size": 7936 00:29:01.700 }, 00:29:01.700 { 00:29:01.700 "name": "BaseBdev2", 00:29:01.700 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:01.700 "is_configured": true, 00:29:01.700 "data_offset": 256, 00:29:01.700 "data_size": 7936 00:29:01.700 } 00:29:01.700 ] 00:29:01.700 }' 00:29:01.700 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:01.959 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:29:01.959 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:01.959 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:01.959 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:29:01.959 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:01.959 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:01.959 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:01.959 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:01.959 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:01.960 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.960 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:02.252 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:02.252 "name": "raid_bdev1", 00:29:02.252 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:02.252 "strip_size_kb": 0, 00:29:02.252 "state": "online", 00:29:02.252 "raid_level": "raid1", 00:29:02.252 "superblock": true, 00:29:02.252 "num_base_bdevs": 2, 00:29:02.252 "num_base_bdevs_discovered": 2, 00:29:02.252 "num_base_bdevs_operational": 2, 00:29:02.252 "base_bdevs_list": [ 00:29:02.252 { 00:29:02.252 "name": "spare", 00:29:02.252 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:29:02.252 
"is_configured": true, 00:29:02.252 "data_offset": 256, 00:29:02.252 "data_size": 7936 00:29:02.252 }, 00:29:02.252 { 00:29:02.252 "name": "BaseBdev2", 00:29:02.252 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:02.253 "is_configured": true, 00:29:02.253 "data_offset": 256, 00:29:02.253 "data_size": 7936 00:29:02.253 } 00:29:02.253 ] 00:29:02.253 }' 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:02.253 18:32:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:02.820 18:32:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:02.820 "name": "raid_bdev1", 00:29:02.820 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:02.820 "strip_size_kb": 0, 00:29:02.820 "state": "online", 00:29:02.820 "raid_level": "raid1", 00:29:02.820 "superblock": true, 00:29:02.820 "num_base_bdevs": 2, 00:29:02.820 "num_base_bdevs_discovered": 2, 00:29:02.820 "num_base_bdevs_operational": 2, 00:29:02.820 "base_bdevs_list": [ 00:29:02.820 { 00:29:02.820 "name": "spare", 00:29:02.820 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:29:02.820 "is_configured": true, 00:29:02.820 "data_offset": 256, 00:29:02.820 "data_size": 7936 00:29:02.820 }, 00:29:02.820 { 00:29:02.820 "name": "BaseBdev2", 00:29:02.820 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:02.820 "is_configured": true, 00:29:02.820 "data_offset": 256, 00:29:02.820 "data_size": 7936 00:29:02.820 } 00:29:02.820 ] 00:29:02.820 }' 00:29:02.820 18:32:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:02.820 18:32:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:03.387 18:32:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:03.645 [2024-07-12 18:32:47.138414] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:29:03.645 [2024-07-12 18:32:47.138440] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:03.645 [2024-07-12 18:32:47.138494] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:03.645 [2024-07-12 18:32:47.138546] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:03.645 [2024-07-12 18:32:47.138557] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x238e370 name raid_bdev1, state offline 00:29:03.645 18:32:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.645 18:32:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:29:03.904 18:32:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:03.904 18:32:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:29:03.904 18:32:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:03.904 18:32:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:04.162 18:32:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:04.420 [2024-07-12 18:32:48.129071] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:04.420 [2024-07-12 18:32:48.129116] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:04.420 [2024-07-12 18:32:48.129137] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2390730 00:29:04.420 [2024-07-12 18:32:48.129150] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:04.420 [2024-07-12 18:32:48.130638] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:04.420 [2024-07-12 18:32:48.130666] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:04.420 [2024-07-12 18:32:48.130721] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:04.420 [2024-07-12 18:32:48.130744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:04.420 [2024-07-12 18:32:48.130830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:04.420 spare 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:04.679 18:32:48 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.679 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:04.679 [2024-07-12 18:32:48.231136] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x238e810 00:29:04.679 [2024-07-12 18:32:48.231152] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:04.679 [2024-07-12 18:32:48.231225] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2389a60 00:29:04.679 [2024-07-12 18:32:48.231317] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x238e810 00:29:04.679 [2024-07-12 18:32:48.231326] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x238e810 00:29:04.679 [2024-07-12 18:32:48.231391] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:04.936 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:04.936 "name": "raid_bdev1", 00:29:04.936 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:04.936 "strip_size_kb": 0, 00:29:04.936 "state": "online", 00:29:04.936 "raid_level": "raid1", 00:29:04.936 "superblock": true, 00:29:04.936 "num_base_bdevs": 2, 00:29:04.937 "num_base_bdevs_discovered": 2, 00:29:04.937 "num_base_bdevs_operational": 2, 00:29:04.937 "base_bdevs_list": [ 00:29:04.937 { 00:29:04.937 "name": "spare", 00:29:04.937 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:29:04.937 "is_configured": true, 00:29:04.937 "data_offset": 256, 00:29:04.937 "data_size": 7936 00:29:04.937 }, 00:29:04.937 { 00:29:04.937 "name": "BaseBdev2", 00:29:04.937 "uuid": 
"87c1d39a-0239-51fa-b267-210420b7e701", 00:29:04.937 "is_configured": true, 00:29:04.937 "data_offset": 256, 00:29:04.937 "data_size": 7936 00:29:04.937 } 00:29:04.937 ] 00:29:04.937 }' 00:29:04.937 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:04.937 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:05.505 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:05.505 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:05.505 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:05.505 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:05.505 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:05.505 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:05.505 18:32:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:05.505 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:05.505 "name": "raid_bdev1", 00:29:05.505 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:05.505 "strip_size_kb": 0, 00:29:05.505 "state": "online", 00:29:05.505 "raid_level": "raid1", 00:29:05.505 "superblock": true, 00:29:05.505 "num_base_bdevs": 2, 00:29:05.505 "num_base_bdevs_discovered": 2, 00:29:05.505 "num_base_bdevs_operational": 2, 00:29:05.505 "base_bdevs_list": [ 00:29:05.505 { 00:29:05.505 "name": "spare", 00:29:05.505 "uuid": 
"05214cbf-c65f-5aee-951b-019e7f0fc563", 00:29:05.505 "is_configured": true, 00:29:05.505 "data_offset": 256, 00:29:05.505 "data_size": 7936 00:29:05.505 }, 00:29:05.505 { 00:29:05.505 "name": "BaseBdev2", 00:29:05.505 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:05.505 "is_configured": true, 00:29:05.505 "data_offset": 256, 00:29:05.505 "data_size": 7936 00:29:05.505 } 00:29:05.505 ] 00:29:05.505 }' 00:29:05.505 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:05.505 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:05.505 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:05.505 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:05.505 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:05.505 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:05.763 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:05.763 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:06.021 [2024-07-12 18:32:49.673281] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.021 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.279 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:06.280 "name": "raid_bdev1", 00:29:06.280 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:06.280 "strip_size_kb": 0, 00:29:06.280 "state": "online", 00:29:06.280 "raid_level": "raid1", 00:29:06.280 "superblock": true, 00:29:06.280 "num_base_bdevs": 2, 00:29:06.280 "num_base_bdevs_discovered": 1, 00:29:06.280 "num_base_bdevs_operational": 1, 00:29:06.280 "base_bdevs_list": [ 00:29:06.280 { 00:29:06.280 "name": null, 00:29:06.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:06.280 "is_configured": false, 00:29:06.280 "data_offset": 
256, 00:29:06.280 "data_size": 7936 00:29:06.280 }, 00:29:06.280 { 00:29:06.280 "name": "BaseBdev2", 00:29:06.280 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:06.280 "is_configured": true, 00:29:06.280 "data_offset": 256, 00:29:06.280 "data_size": 7936 00:29:06.280 } 00:29:06.280 ] 00:29:06.280 }' 00:29:06.280 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:06.280 18:32:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:06.843 18:32:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:07.100 [2024-07-12 18:32:50.716059] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:07.100 [2024-07-12 18:32:50.716209] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:07.100 [2024-07-12 18:32:50.716225] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:07.100 [2024-07-12 18:32:50.716252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:07.100 [2024-07-12 18:32:50.719711] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x238ed90 00:29:07.100 [2024-07-12 18:32:50.721123] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:07.100 18:32:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:08.032 18:32:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:08.032 18:32:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:08.032 18:32:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:08.032 18:32:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:08.032 18:32:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:08.032 18:32:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.032 18:32:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:08.289 18:32:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:08.289 "name": "raid_bdev1", 00:29:08.289 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:08.289 "strip_size_kb": 0, 00:29:08.289 "state": "online", 00:29:08.289 "raid_level": "raid1", 00:29:08.289 "superblock": true, 00:29:08.289 "num_base_bdevs": 2, 00:29:08.289 "num_base_bdevs_discovered": 2, 00:29:08.289 "num_base_bdevs_operational": 2, 00:29:08.289 "process": { 00:29:08.289 "type": 
"rebuild", 00:29:08.289 "target": "spare", 00:29:08.289 "progress": { 00:29:08.289 "blocks": 3072, 00:29:08.289 "percent": 38 00:29:08.289 } 00:29:08.289 }, 00:29:08.289 "base_bdevs_list": [ 00:29:08.289 { 00:29:08.289 "name": "spare", 00:29:08.289 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:29:08.289 "is_configured": true, 00:29:08.289 "data_offset": 256, 00:29:08.289 "data_size": 7936 00:29:08.289 }, 00:29:08.289 { 00:29:08.289 "name": "BaseBdev2", 00:29:08.289 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:08.289 "is_configured": true, 00:29:08.289 "data_offset": 256, 00:29:08.289 "data_size": 7936 00:29:08.289 } 00:29:08.289 ] 00:29:08.289 }' 00:29:08.289 18:32:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:08.546 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:08.546 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:08.546 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:08.546 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:08.804 [2024-07-12 18:32:52.290439] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:08.804 [2024-07-12 18:32:52.333941] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:08.804 [2024-07-12 18:32:52.333986] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:08.804 [2024-07-12 18:32:52.334001] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:08.804 [2024-07-12 18:32:52.334009] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.804 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:09.061 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:09.061 "name": "raid_bdev1", 00:29:09.061 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:09.061 "strip_size_kb": 0, 00:29:09.061 "state": "online", 00:29:09.061 "raid_level": "raid1", 00:29:09.061 "superblock": true, 00:29:09.061 
"num_base_bdevs": 2, 00:29:09.061 "num_base_bdevs_discovered": 1, 00:29:09.061 "num_base_bdevs_operational": 1, 00:29:09.061 "base_bdevs_list": [ 00:29:09.061 { 00:29:09.061 "name": null, 00:29:09.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:09.061 "is_configured": false, 00:29:09.061 "data_offset": 256, 00:29:09.061 "data_size": 7936 00:29:09.061 }, 00:29:09.061 { 00:29:09.061 "name": "BaseBdev2", 00:29:09.061 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:09.061 "is_configured": true, 00:29:09.061 "data_offset": 256, 00:29:09.061 "data_size": 7936 00:29:09.061 } 00:29:09.061 ] 00:29:09.061 }' 00:29:09.061 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:09.061 18:32:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:09.991 18:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:09.991 [2024-07-12 18:32:53.625801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:09.991 [2024-07-12 18:32:53.625847] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:09.991 [2024-07-12 18:32:53.625869] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238dc80 00:29:09.991 [2024-07-12 18:32:53.625882] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:09.991 [2024-07-12 18:32:53.626074] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:09.991 [2024-07-12 18:32:53.626090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:09.991 [2024-07-12 18:32:53.626143] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:09.991 [2024-07-12 18:32:53.626154] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:09.991 [2024-07-12 18:32:53.626164] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:09.991 [2024-07-12 18:32:53.626181] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:09.991 [2024-07-12 18:32:53.629634] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x238e0a0 00:29:09.991 [2024-07-12 18:32:53.630972] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:09.991 spare 00:29:09.991 18:32:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:11.394 "name": "raid_bdev1", 00:29:11.394 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 
00:29:11.394 "strip_size_kb": 0, 00:29:11.394 "state": "online", 00:29:11.394 "raid_level": "raid1", 00:29:11.394 "superblock": true, 00:29:11.394 "num_base_bdevs": 2, 00:29:11.394 "num_base_bdevs_discovered": 2, 00:29:11.394 "num_base_bdevs_operational": 2, 00:29:11.394 "process": { 00:29:11.394 "type": "rebuild", 00:29:11.394 "target": "spare", 00:29:11.394 "progress": { 00:29:11.394 "blocks": 3072, 00:29:11.394 "percent": 38 00:29:11.394 } 00:29:11.394 }, 00:29:11.394 "base_bdevs_list": [ 00:29:11.394 { 00:29:11.394 "name": "spare", 00:29:11.394 "uuid": "05214cbf-c65f-5aee-951b-019e7f0fc563", 00:29:11.394 "is_configured": true, 00:29:11.394 "data_offset": 256, 00:29:11.394 "data_size": 7936 00:29:11.394 }, 00:29:11.394 { 00:29:11.394 "name": "BaseBdev2", 00:29:11.394 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:11.394 "is_configured": true, 00:29:11.394 "data_offset": 256, 00:29:11.394 "data_size": 7936 00:29:11.394 } 00:29:11.394 ] 00:29:11.394 }' 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:11.394 18:32:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:11.652 [2024-07-12 18:32:55.187922] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:11.652 [2024-07-12 18:32:55.243415] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:11.652 [2024-07-12 
18:32:55.243458] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:11.652 [2024-07-12 18:32:55.243474] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:11.652 [2024-07-12 18:32:55.243482] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.652 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:11.909 18:32:55 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:11.909 "name": "raid_bdev1", 00:29:11.909 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:11.909 "strip_size_kb": 0, 00:29:11.909 "state": "online", 00:29:11.909 "raid_level": "raid1", 00:29:11.909 "superblock": true, 00:29:11.909 "num_base_bdevs": 2, 00:29:11.909 "num_base_bdevs_discovered": 1, 00:29:11.909 "num_base_bdevs_operational": 1, 00:29:11.909 "base_bdevs_list": [ 00:29:11.909 { 00:29:11.909 "name": null, 00:29:11.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:11.909 "is_configured": false, 00:29:11.909 "data_offset": 256, 00:29:11.909 "data_size": 7936 00:29:11.909 }, 00:29:11.909 { 00:29:11.909 "name": "BaseBdev2", 00:29:11.909 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:11.909 "is_configured": true, 00:29:11.909 "data_offset": 256, 00:29:11.909 "data_size": 7936 00:29:11.909 } 00:29:11.909 ] 00:29:11.909 }' 00:29:11.909 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:11.909 18:32:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:12.475 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:12.475 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:12.475 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:12.475 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:12.475 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:12.475 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.475 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.734 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:12.734 "name": "raid_bdev1", 00:29:12.734 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:12.734 "strip_size_kb": 0, 00:29:12.734 "state": "online", 00:29:12.734 "raid_level": "raid1", 00:29:12.734 "superblock": true, 00:29:12.734 "num_base_bdevs": 2, 00:29:12.734 "num_base_bdevs_discovered": 1, 00:29:12.734 "num_base_bdevs_operational": 1, 00:29:12.734 "base_bdevs_list": [ 00:29:12.734 { 00:29:12.734 "name": null, 00:29:12.734 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:12.734 "is_configured": false, 00:29:12.734 "data_offset": 256, 00:29:12.734 "data_size": 7936 00:29:12.734 }, 00:29:12.734 { 00:29:12.734 "name": "BaseBdev2", 00:29:12.734 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:12.734 "is_configured": true, 00:29:12.734 "data_offset": 256, 00:29:12.734 "data_size": 7936 00:29:12.734 } 00:29:12.734 ] 00:29:12.734 }' 00:29:12.734 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:12.734 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:12.734 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:12.734 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:12.734 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:12.993 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:13.252 [2024-07-12 18:32:56.819921] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:13.252 [2024-07-12 18:32:56.819971] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:13.252 [2024-07-12 18:32:56.819991] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238d510 00:29:13.252 [2024-07-12 18:32:56.820004] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:13.252 [2024-07-12 18:32:56.820172] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:13.252 [2024-07-12 18:32:56.820188] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:13.252 [2024-07-12 18:32:56.820238] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:13.252 [2024-07-12 18:32:56.820249] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:13.252 [2024-07-12 18:32:56.820259] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:13.252 BaseBdev1 00:29:13.253 18:32:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:14.189 18:32:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:14.448 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:14.448 "name": "raid_bdev1", 00:29:14.448 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:14.448 "strip_size_kb": 0, 00:29:14.448 "state": "online", 00:29:14.448 "raid_level": "raid1", 00:29:14.448 "superblock": true, 00:29:14.448 "num_base_bdevs": 2, 00:29:14.448 "num_base_bdevs_discovered": 1, 00:29:14.448 "num_base_bdevs_operational": 1, 00:29:14.448 "base_bdevs_list": [ 00:29:14.448 { 00:29:14.448 "name": null, 00:29:14.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:14.448 "is_configured": false, 00:29:14.448 "data_offset": 256, 00:29:14.448 "data_size": 7936 00:29:14.448 }, 00:29:14.448 { 00:29:14.448 "name": "BaseBdev2", 00:29:14.448 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:14.448 "is_configured": true, 00:29:14.448 "data_offset": 256, 00:29:14.448 
"data_size": 7936 00:29:14.448 } 00:29:14.448 ] 00:29:14.448 }' 00:29:14.448 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:14.448 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:15.015 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:15.015 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:15.015 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:15.015 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:15.015 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:15.015 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.015 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:15.273 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:15.273 "name": "raid_bdev1", 00:29:15.273 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:15.273 "strip_size_kb": 0, 00:29:15.273 "state": "online", 00:29:15.273 "raid_level": "raid1", 00:29:15.273 "superblock": true, 00:29:15.273 "num_base_bdevs": 2, 00:29:15.273 "num_base_bdevs_discovered": 1, 00:29:15.273 "num_base_bdevs_operational": 1, 00:29:15.273 "base_bdevs_list": [ 00:29:15.273 { 00:29:15.273 "name": null, 00:29:15.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:15.273 "is_configured": false, 00:29:15.273 "data_offset": 256, 00:29:15.273 "data_size": 7936 00:29:15.273 }, 
00:29:15.273 { 00:29:15.273 "name": "BaseBdev2", 00:29:15.273 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:15.273 "is_configured": true, 00:29:15.273 "data_offset": 256, 00:29:15.273 "data_size": 7936 00:29:15.273 } 00:29:15.273 ] 00:29:15.273 }' 00:29:15.273 18:32:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:15.532 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:16.098 [2024-07-12 18:32:59.567263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:16.098 [2024-07-12 18:32:59.567386] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:16.098 [2024-07-12 18:32:59.567403] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:16.098 request: 00:29:16.098 { 00:29:16.098 "base_bdev": "BaseBdev1", 00:29:16.098 "raid_bdev": "raid_bdev1", 00:29:16.098 "method": "bdev_raid_add_base_bdev", 00:29:16.098 "req_id": 1 00:29:16.098 } 00:29:16.098 Got JSON-RPC error response 00:29:16.098 response: 00:29:16.098 { 00:29:16.098 "code": -22, 00:29:16.098 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:16.098 } 00:29:16.098 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:16.098 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:29:16.098 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:16.098 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:16.098 18:32:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.032 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.291 18:33:00 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.291 "name": "raid_bdev1", 00:29:17.291 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:17.291 "strip_size_kb": 0, 00:29:17.291 "state": "online", 00:29:17.291 "raid_level": "raid1", 00:29:17.291 "superblock": true, 00:29:17.291 "num_base_bdevs": 2, 00:29:17.291 "num_base_bdevs_discovered": 1, 00:29:17.291 "num_base_bdevs_operational": 1, 00:29:17.291 "base_bdevs_list": [ 00:29:17.291 { 00:29:17.291 "name": null, 00:29:17.291 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:17.291 "is_configured": false, 00:29:17.291 "data_offset": 256, 00:29:17.291 "data_size": 7936 00:29:17.291 }, 00:29:17.291 { 00:29:17.291 "name": "BaseBdev2", 00:29:17.291 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:17.291 "is_configured": true, 00:29:17.291 "data_offset": 256, 00:29:17.291 "data_size": 7936 00:29:17.291 } 00:29:17.291 ] 00:29:17.291 }' 00:29:17.291 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.291 18:33:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:17.857 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:17.857 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:17.857 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:17.857 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:17.857 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:17.857 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.857 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:18.116 "name": "raid_bdev1", 00:29:18.116 "uuid": "0f5d0563-a136-4190-b1f3-16816bd77e16", 00:29:18.116 "strip_size_kb": 0, 00:29:18.116 "state": "online", 00:29:18.116 "raid_level": "raid1", 00:29:18.116 "superblock": true, 00:29:18.116 "num_base_bdevs": 2, 00:29:18.116 "num_base_bdevs_discovered": 1, 00:29:18.116 "num_base_bdevs_operational": 1, 00:29:18.116 "base_bdevs_list": [ 00:29:18.116 { 00:29:18.116 "name": null, 00:29:18.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:18.116 "is_configured": false, 00:29:18.116 "data_offset": 256, 00:29:18.116 "data_size": 7936 00:29:18.116 }, 00:29:18.116 { 00:29:18.116 "name": "BaseBdev2", 00:29:18.116 "uuid": "87c1d39a-0239-51fa-b267-210420b7e701", 00:29:18.116 "is_configured": true, 00:29:18.116 "data_offset": 256, 00:29:18.116 "data_size": 7936 00:29:18.116 } 00:29:18.116 ] 00:29:18.116 }' 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 2618623 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2618623 ']' 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@952 -- # kill -0 2618623 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2618623 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2618623' 00:29:18.116 killing process with pid 2618623 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2618623 00:29:18.116 Received shutdown signal, test time was about 60.000000 seconds 00:29:18.116 00:29:18.116 Latency(us) 00:29:18.116 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:18.116 =================================================================================================================== 00:29:18.116 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:18.116 [2024-07-12 18:33:01.762868] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:18.116 [2024-07-12 18:33:01.762965] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:18.116 [2024-07-12 18:33:01.763007] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:18.116 [2024-07-12 18:33:01.763020] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x238e810 name raid_bdev1, state offline 00:29:18.116 18:33:01 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- common/autotest_common.sh@972 -- # wait 2618623 00:29:18.116 [2024-07-12 18:33:01.791080] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:18.375 18:33:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:29:18.375 00:29:18.375 real 0m29.338s 00:29:18.375 user 0m46.951s 00:29:18.375 sys 0m3.842s 00:29:18.375 18:33:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:18.375 18:33:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:18.375 ************************************ 00:29:18.375 END TEST raid_rebuild_test_sb_md_interleaved 00:29:18.375 ************************************ 00:29:18.375 18:33:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:18.375 18:33:02 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:29:18.375 18:33:02 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:29:18.375 18:33:02 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 2618623 ']' 00:29:18.375 18:33:02 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 2618623 00:29:18.375 18:33:02 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:29:18.375 00:29:18.375 real 18m44.378s 00:29:18.375 user 31m43.610s 00:29:18.375 sys 3m23.828s 00:29:18.375 18:33:02 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:18.375 18:33:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:18.375 ************************************ 00:29:18.375 END TEST bdev_raid 00:29:18.375 ************************************ 00:29:18.633 18:33:02 -- common/autotest_common.sh@1142 -- # return 0 00:29:18.633 18:33:02 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:18.633 18:33:02 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:18.633 18:33:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:18.633 18:33:02 -- 
common/autotest_common.sh@10 -- # set +x 00:29:18.633 ************************************ 00:29:18.633 START TEST bdevperf_config 00:29:18.633 ************************************ 00:29:18.633 18:33:02 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:18.633 * Looking for test storage... 00:29:18.633 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:18.633 18:33:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:18.634 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@20 -- # 
cat 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:18.634 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:18.634 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:18.634 00:29:18.634 18:33:02 bdevperf_config -- 
bdevperf/common.sh@20 -- # cat 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:18.634 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:18.634 18:33:02 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:21.917 18:33:05 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-12 18:33:02.375438] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:29:21.917 [2024-07-12 18:33:02.375505] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2623069 ] 00:29:21.917 Using job config with 4 jobs 00:29:21.917 [2024-07-12 18:33:02.523530] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:21.917 [2024-07-12 18:33:02.641560] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:21.917 cpumask for '\''job0'\'' is too big 00:29:21.917 cpumask for '\''job1'\'' is too big 00:29:21.917 cpumask for '\''job2'\'' is too big 00:29:21.917 cpumask for '\''job3'\'' is too big 00:29:21.917 Running I/O for 2 seconds... 00:29:21.917 00:29:21.917 Latency(us) 00:29:21.917 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:21.917 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.02 24497.02 23.92 0.00 0.00 10443.22 1866.35 16070.57 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.02 24474.75 23.90 0.00 0.00 10429.64 1837.86 14246.96 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.02 24452.52 23.88 0.00 0.00 10416.19 1837.86 12366.36 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.03 24524.66 23.95 0.00 0.00 10362.50 918.93 10713.71 00:29:21.918 =================================================================================================================== 00:29:21.918 Total : 97948.94 95.65 0.00 0.00 10412.82 918.93 16070.57' 00:29:21.918 18:33:05 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-12 18:33:02.375438] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:29:21.918 [2024-07-12 18:33:02.375505] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2623069 ] 00:29:21.918 Using job config with 4 jobs 00:29:21.918 [2024-07-12 18:33:02.523530] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:21.918 [2024-07-12 18:33:02.641560] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:21.918 cpumask for '\''job0'\'' is too big 00:29:21.918 cpumask for '\''job1'\'' is too big 00:29:21.918 cpumask for '\''job2'\'' is too big 00:29:21.918 cpumask for '\''job3'\'' is too big 00:29:21.918 Running I/O for 2 seconds... 00:29:21.918 00:29:21.918 Latency(us) 00:29:21.918 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.02 24497.02 23.92 0.00 0.00 10443.22 1866.35 16070.57 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.02 24474.75 23.90 0.00 0.00 10429.64 1837.86 14246.96 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.02 24452.52 23.88 0.00 0.00 10416.19 1837.86 12366.36 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.03 24524.66 23.95 0.00 0.00 10362.50 918.93 10713.71 00:29:21.918 =================================================================================================================== 00:29:21.918 Total : 97948.94 95.65 0.00 0.00 10412.82 918.93 16070.57' 00:29:21.918 18:33:05 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 18:33:02.375438] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:29:21.918 [2024-07-12 18:33:02.375505] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2623069 ] 00:29:21.918 Using job config with 4 jobs 00:29:21.918 [2024-07-12 18:33:02.523530] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:21.918 [2024-07-12 18:33:02.641560] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:21.918 cpumask for '\''job0'\'' is too big 00:29:21.918 cpumask for '\''job1'\'' is too big 00:29:21.918 cpumask for '\''job2'\'' is too big 00:29:21.918 cpumask for '\''job3'\'' is too big 00:29:21.918 Running I/O for 2 seconds... 00:29:21.918 00:29:21.918 Latency(us) 00:29:21.918 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.02 24497.02 23.92 0.00 0.00 10443.22 1866.35 16070.57 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.02 24474.75 23.90 0.00 0.00 10429.64 1837.86 14246.96 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.02 24452.52 23.88 0.00 0.00 10416.19 1837.86 12366.36 00:29:21.918 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:21.918 Malloc0 : 2.03 24524.66 23.95 0.00 0.00 10362.50 918.93 10713.71 00:29:21.918 =================================================================================================================== 00:29:21.918 Total : 97948.94 95.65 0.00 0.00 10412.82 918.93 16070.57' 00:29:21.918 18:33:05 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:21.918 18:33:05 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:21.918 18:33:05 
bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:29:21.918 18:33:05 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:21.918 [2024-07-12 18:33:05.149799] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:29:21.918 [2024-07-12 18:33:05.149872] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2623422 ] 00:29:21.918 [2024-07-12 18:33:05.295747] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:21.918 [2024-07-12 18:33:05.415357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:21.918 cpumask for 'job0' is too big 00:29:21.918 cpumask for 'job1' is too big 00:29:21.918 cpumask for 'job2' is too big 00:29:21.918 cpumask for 'job3' is too big 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:29:24.450 Running I/O for 2 seconds... 
00:29:24.450 00:29:24.450 Latency(us) 00:29:24.450 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:24.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:24.450 Malloc0 : 2.02 24464.19 23.89 0.00 0.00 10448.05 1852.10 16070.57 00:29:24.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:24.450 Malloc0 : 2.02 24441.91 23.87 0.00 0.00 10433.97 1823.61 14189.97 00:29:24.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:24.450 Malloc0 : 2.02 24419.85 23.85 0.00 0.00 10420.59 1823.61 12366.36 00:29:24.450 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:24.450 Malloc0 : 2.03 24397.77 23.83 0.00 0.00 10407.17 1823.61 10713.71 00:29:24.450 =================================================================================================================== 00:29:24.450 Total : 97723.72 95.43 0.00 0.00 10427.44 1823.61 16070.57' 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:24.450 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job 
job1 write Malloc0 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:24.450 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:24.450 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:24.450 18:33:07 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:26.980 18:33:10 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-12 18:33:07.901424] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:29:26.980 [2024-07-12 18:33:07.901489] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624151 ] 00:29:26.980 Using job config with 3 jobs 00:29:26.980 [2024-07-12 18:33:08.044225] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:26.980 [2024-07-12 18:33:08.159677] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:26.980 cpumask for '\''job0'\'' is too big 00:29:26.980 cpumask for '\''job1'\'' is too big 00:29:26.980 cpumask for '\''job2'\'' is too big 00:29:26.980 Running I/O for 2 seconds... 00:29:26.980 00:29:26.980 Latency(us) 00:29:26.980 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:26.980 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:26.980 Malloc0 : 2.02 33018.00 32.24 0.00 0.00 7742.85 1787.99 11397.57 00:29:26.980 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:26.980 Malloc0 : 2.02 32987.88 32.21 0.00 0.00 7733.47 1773.75 9630.94 00:29:26.980 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:26.980 Malloc0 : 2.02 32957.92 32.19 0.00 0.00 7723.67 1766.62 9630.94 00:29:26.980 =================================================================================================================== 00:29:26.981 Total : 98963.80 96.64 0.00 0.00 7733.33 1766.62 11397.57' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-12 18:33:07.901424] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:29:26.981 [2024-07-12 18:33:07.901489] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624151 ] 00:29:26.981 Using job config with 3 jobs 00:29:26.981 [2024-07-12 18:33:08.044225] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:26.981 [2024-07-12 18:33:08.159677] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:26.981 cpumask for '\''job0'\'' is too big 00:29:26.981 cpumask for '\''job1'\'' is too big 00:29:26.981 cpumask for '\''job2'\'' is too big 00:29:26.981 Running I/O for 2 seconds... 00:29:26.981 00:29:26.981 Latency(us) 00:29:26.981 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:26.981 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:26.981 Malloc0 : 2.02 33018.00 32.24 0.00 0.00 7742.85 1787.99 11397.57 00:29:26.981 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:26.981 Malloc0 : 2.02 32987.88 32.21 0.00 0.00 7733.47 1773.75 9630.94 00:29:26.981 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:26.981 Malloc0 : 2.02 32957.92 32.19 0.00 0.00 7723.67 1766.62 9630.94 00:29:26.981 =================================================================================================================== 00:29:26.981 Total : 98963.80 96.64 0.00 0.00 7733.33 1766.62 11397.57' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 18:33:07.901424] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:29:26.981 [2024-07-12 18:33:07.901489] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624151 ] 00:29:26.981 Using job config with 3 jobs 00:29:26.981 [2024-07-12 18:33:08.044225] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:26.981 [2024-07-12 18:33:08.159677] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:26.981 cpumask for '\''job0'\'' is too big 00:29:26.981 cpumask for '\''job1'\'' is too big 00:29:26.981 cpumask for '\''job2'\'' is too big 00:29:26.981 Running I/O for 2 seconds... 00:29:26.981 00:29:26.981 Latency(us) 00:29:26.981 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:26.981 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:26.981 Malloc0 : 2.02 33018.00 32.24 0.00 0.00 7742.85 1787.99 11397.57 00:29:26.981 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:26.981 Malloc0 : 2.02 32987.88 32.21 0.00 0.00 7733.47 1773.75 9630.94 00:29:26.981 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:29:26.981 Malloc0 : 2.02 32957.92 32.19 0.00 0.00 7723.67 1766.62 9630.94 00:29:26.981 =================================================================================================================== 00:29:26.981 Total : 98963.80 96.64 0.00 0.00 7733.33 1766.62 11397.57' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:26.981 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:26.981 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:26.981 18:33:10 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:26.981 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:26.981 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:26.981 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:26.981 18:33:10 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:30.263 18:33:13 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-12 18:33:10.663472] 
Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:29:30.263 [2024-07-12 18:33:10.663543] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624511 ] 00:29:30.263 Using job config with 4 jobs 00:29:30.263 [2024-07-12 18:33:10.806251] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:30.263 [2024-07-12 18:33:10.923228] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:30.263 cpumask for '\''job0'\'' is too big 00:29:30.263 cpumask for '\''job1'\'' is too big 00:29:30.263 cpumask for '\''job2'\'' is too big 00:29:30.263 cpumask for '\''job3'\'' is too big 00:29:30.263 Running I/O for 2 seconds... 00:29:30.263 00:29:30.263 Latency(us) 00:29:30.263 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:30.263 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.263 Malloc0 : 2.02 12136.47 11.85 0.00 0.00 21077.43 3789.69 32597.04 00:29:30.263 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.263 Malloc1 : 2.03 12125.14 11.84 0.00 0.00 21074.52 4587.52 32597.04 00:29:30.263 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.263 Malloc0 : 2.04 12145.08 11.86 0.00 0.00 20966.50 3732.70 28721.86 00:29:30.263 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.263 Malloc1 : 2.05 12133.91 11.85 0.00 0.00 20967.07 4530.53 28721.86 00:29:30.263 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.263 Malloc0 : 2.05 12123.00 11.84 0.00 0.00 20912.17 3732.70 24960.67 00:29:30.263 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.263 
Malloc1 : 2.05 12111.85 11.83 0.00 0.00 20913.58 4559.03 24960.67 00:29:30.263 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc0 : 2.05 12100.99 11.82 0.00 0.00 20859.61 3732.70 21427.42 00:29:30.264 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc1 : 2.05 12089.89 11.81 0.00 0.00 20858.95 4559.03 21427.42 00:29:30.264 =================================================================================================================== 00:29:30.264 Total : 96966.32 94.69 0.00 0.00 20953.41 3732.70 32597.04' 00:29:30.264 18:33:13 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-12 18:33:10.663472] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:29:30.264 [2024-07-12 18:33:10.663543] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624511 ] 00:29:30.264 Using job config with 4 jobs 00:29:30.264 [2024-07-12 18:33:10.806251] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:30.264 [2024-07-12 18:33:10.923228] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:30.264 cpumask for '\''job0'\'' is too big 00:29:30.264 cpumask for '\''job1'\'' is too big 00:29:30.264 cpumask for '\''job2'\'' is too big 00:29:30.264 cpumask for '\''job3'\'' is too big 00:29:30.264 Running I/O for 2 seconds... 
00:29:30.264 00:29:30.264 Latency(us) 00:29:30.264 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:30.264 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc0 : 2.02 12136.47 11.85 0.00 0.00 21077.43 3789.69 32597.04 00:29:30.264 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc1 : 2.03 12125.14 11.84 0.00 0.00 21074.52 4587.52 32597.04 00:29:30.264 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc0 : 2.04 12145.08 11.86 0.00 0.00 20966.50 3732.70 28721.86 00:29:30.264 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc1 : 2.05 12133.91 11.85 0.00 0.00 20967.07 4530.53 28721.86 00:29:30.264 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc0 : 2.05 12123.00 11.84 0.00 0.00 20912.17 3732.70 24960.67 00:29:30.264 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc1 : 2.05 12111.85 11.83 0.00 0.00 20913.58 4559.03 24960.67 00:29:30.264 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc0 : 2.05 12100.99 11.82 0.00 0.00 20859.61 3732.70 21427.42 00:29:30.264 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc1 : 2.05 12089.89 11.81 0.00 0.00 20858.95 4559.03 21427.42 00:29:30.264 =================================================================================================================== 00:29:30.264 Total : 96966.32 94.69 0.00 0.00 20953.41 3732.70 32597.04' 00:29:30.264 18:33:13 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 18:33:10.663472] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:29:30.264 [2024-07-12 18:33:10.663543] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624511 ] 00:29:30.264 Using job config with 4 jobs 00:29:30.264 [2024-07-12 18:33:10.806251] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:30.264 [2024-07-12 18:33:10.923228] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:30.264 cpumask for '\''job0'\'' is too big 00:29:30.264 cpumask for '\''job1'\'' is too big 00:29:30.264 cpumask for '\''job2'\'' is too big 00:29:30.264 cpumask for '\''job3'\'' is too big 00:29:30.264 Running I/O for 2 seconds... 00:29:30.264 00:29:30.264 Latency(us) 00:29:30.264 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:30.264 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc0 : 2.02 12136.47 11.85 0.00 0.00 21077.43 3789.69 32597.04 00:29:30.264 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc1 : 2.03 12125.14 11.84 0.00 0.00 21074.52 4587.52 32597.04 00:29:30.264 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc0 : 2.04 12145.08 11.86 0.00 0.00 20966.50 3732.70 28721.86 00:29:30.264 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc1 : 2.05 12133.91 11.85 0.00 0.00 20967.07 4530.53 28721.86 00:29:30.264 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc0 : 2.05 12123.00 11.84 0.00 0.00 20912.17 3732.70 24960.67 00:29:30.264 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc1 : 2.05 12111.85 11.83 0.00 0.00 20913.58 4559.03 24960.67 00:29:30.264 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc0 : 2.05 12100.99 11.82 0.00 0.00 20859.61 3732.70 21427.42 00:29:30.264 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:29:30.264 Malloc1 : 2.05 12089.89 11.81 0.00 0.00 20858.95 4559.03 21427.42 00:29:30.264 =================================================================================================================== 00:29:30.264 Total : 96966.32 94.69 0.00 0.00 20953.41 3732.70 32597.04' 00:29:30.264 18:33:13 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:30.264 18:33:13 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:30.264 18:33:13 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:29:30.264 18:33:13 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:29:30.264 18:33:13 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:30.264 18:33:13 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:29:30.264 00:29:30.264 real 0m11.233s 00:29:30.264 user 0m9.909s 00:29:30.264 sys 0m1.181s 00:29:30.264 18:33:13 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:30.264 18:33:13 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:29:30.264 ************************************ 00:29:30.264 END TEST bdevperf_config 00:29:30.264 ************************************ 00:29:30.264 18:33:13 -- common/autotest_common.sh@1142 -- # return 0 00:29:30.264 18:33:13 -- spdk/autotest.sh@192 -- # uname -s 00:29:30.264 18:33:13 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:29:30.264 18:33:13 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:30.264 18:33:13 -- common/autotest_common.sh@1099 
-- # '[' 2 -le 1 ']' 00:29:30.264 18:33:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:30.264 18:33:13 -- common/autotest_common.sh@10 -- # set +x 00:29:30.264 ************************************ 00:29:30.264 START TEST reactor_set_interrupt 00:29:30.264 ************************************ 00:29:30.264 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:30.264 * Looking for test storage... 00:29:30.264 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:30.264 18:33:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:29:30.264 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:29:30.264 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:30.264 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:30.264 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:29:30.264 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:30.264 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:29:30.264 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:29:30.264 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:29:30.264 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:29:30.264 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:29:30.264 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:29:30.264 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:29:30.264 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:29:30.264 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:29:30.264 18:33:13 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:29:30.264 18:33:13 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:29:30.265 18:33:13 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:29:30.265 18:33:13 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:29:30.265 18:33:13 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:29:30.265 18:33:13 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:29:30.265 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:29:30.265 18:33:13 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:29:30.265 #define SPDK_CONFIG_H 00:29:30.265 #define SPDK_CONFIG_APPS 1 00:29:30.265 #define SPDK_CONFIG_ARCH native 00:29:30.265 #undef SPDK_CONFIG_ASAN 00:29:30.265 #undef SPDK_CONFIG_AVAHI 00:29:30.265 #undef SPDK_CONFIG_CET 00:29:30.265 #define SPDK_CONFIG_COVERAGE 1 00:29:30.265 #define SPDK_CONFIG_CROSS_PREFIX 
00:29:30.265 #define SPDK_CONFIG_CRYPTO 1 00:29:30.265 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:29:30.265 #undef SPDK_CONFIG_CUSTOMOCF 00:29:30.265 #undef SPDK_CONFIG_DAOS 00:29:30.265 #define SPDK_CONFIG_DAOS_DIR 00:29:30.265 #define SPDK_CONFIG_DEBUG 1 00:29:30.265 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:29:30.265 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:30.265 #define SPDK_CONFIG_DPDK_INC_DIR 00:29:30.265 #define SPDK_CONFIG_DPDK_LIB_DIR 00:29:30.265 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:29:30.265 #undef SPDK_CONFIG_DPDK_UADK 00:29:30.265 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:30.265 #define SPDK_CONFIG_EXAMPLES 1 00:29:30.265 #undef SPDK_CONFIG_FC 00:29:30.265 #define SPDK_CONFIG_FC_PATH 00:29:30.265 #define SPDK_CONFIG_FIO_PLUGIN 1 00:29:30.265 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:29:30.265 #undef SPDK_CONFIG_FUSE 00:29:30.265 #undef SPDK_CONFIG_FUZZER 00:29:30.265 #define SPDK_CONFIG_FUZZER_LIB 00:29:30.265 #undef SPDK_CONFIG_GOLANG 00:29:30.265 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:29:30.265 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:29:30.265 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:29:30.265 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:29:30.265 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:29:30.265 #undef SPDK_CONFIG_HAVE_LIBBSD 00:29:30.265 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:29:30.265 #define SPDK_CONFIG_IDXD 1 00:29:30.265 #define SPDK_CONFIG_IDXD_KERNEL 1 00:29:30.265 #define SPDK_CONFIG_IPSEC_MB 1 00:29:30.265 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:30.265 #define SPDK_CONFIG_ISAL 1 00:29:30.265 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:29:30.265 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:29:30.265 #define SPDK_CONFIG_LIBDIR 00:29:30.265 #undef SPDK_CONFIG_LTO 00:29:30.265 #define SPDK_CONFIG_MAX_LCORES 128 00:29:30.265 #define SPDK_CONFIG_NVME_CUSE 1 00:29:30.265 #undef 
SPDK_CONFIG_OCF 00:29:30.265 #define SPDK_CONFIG_OCF_PATH 00:29:30.265 #define SPDK_CONFIG_OPENSSL_PATH 00:29:30.265 #undef SPDK_CONFIG_PGO_CAPTURE 00:29:30.265 #define SPDK_CONFIG_PGO_DIR 00:29:30.265 #undef SPDK_CONFIG_PGO_USE 00:29:30.265 #define SPDK_CONFIG_PREFIX /usr/local 00:29:30.265 #undef SPDK_CONFIG_RAID5F 00:29:30.265 #undef SPDK_CONFIG_RBD 00:29:30.265 #define SPDK_CONFIG_RDMA 1 00:29:30.265 #define SPDK_CONFIG_RDMA_PROV verbs 00:29:30.265 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:29:30.265 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:29:30.265 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:29:30.265 #define SPDK_CONFIG_SHARED 1 00:29:30.265 #undef SPDK_CONFIG_SMA 00:29:30.265 #define SPDK_CONFIG_TESTS 1 00:29:30.265 #undef SPDK_CONFIG_TSAN 00:29:30.265 #define SPDK_CONFIG_UBLK 1 00:29:30.265 #define SPDK_CONFIG_UBSAN 1 00:29:30.265 #undef SPDK_CONFIG_UNIT_TESTS 00:29:30.265 #undef SPDK_CONFIG_URING 00:29:30.265 #define SPDK_CONFIG_URING_PATH 00:29:30.265 #undef SPDK_CONFIG_URING_ZNS 00:29:30.265 #undef SPDK_CONFIG_USDT 00:29:30.265 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:29:30.265 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:29:30.266 #undef SPDK_CONFIG_VFIO_USER 00:29:30.266 #define SPDK_CONFIG_VFIO_USER_DIR 00:29:30.266 #define SPDK_CONFIG_VHOST 1 00:29:30.266 #define SPDK_CONFIG_VIRTIO 1 00:29:30.266 #undef SPDK_CONFIG_VTUNE 00:29:30.266 #define SPDK_CONFIG_VTUNE_DIR 00:29:30.266 #define SPDK_CONFIG_WERROR 1 00:29:30.266 #define SPDK_CONFIG_WPDK_DIR 00:29:30.266 #undef SPDK_CONFIG_XNVME 00:29:30.266 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:29:30.266 18:33:13 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:30.266 18:33:13 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:29:30.266 18:33:13 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:30.266 18:33:13 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:30.266 18:33:13 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:30.266 18:33:13 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:30.266 18:33:13 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:30.266 18:33:13 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:29:30.266 18:33:13 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:29:30.266 18:33:13 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:29:30.266 18:33:13 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:29:30.266 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:29:30.267 18:33:13 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:29:30.267 
18:33:13 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:30.267 18:33:13 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:30.267 18:33:13 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 2624901 ]] 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 2624901 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:29:30.267 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.PWQ6Rj 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.PWQ6Rj/tests/interrupt /tmp/spdk.PWQ6Rj 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs 
size use avail _ mount 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88940560384 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:29:30.268 18:33:13 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5567954944 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47250882560 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892308480 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9396224 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253807104 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:30.268 18:33:13 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=450560 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:29:30.268 * Looking for test storage... 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88940560384 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:29:30.268 18:33:13 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7782547456 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:30.268 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:29:30.268 18:33:13 reactor_set_interrupt 
-- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:29:30.268 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:29:30.268 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:29:30.268 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:30.268 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:29:30.268 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:29:30.268 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:29:30.268 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:29:30.268 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:29:30.269 18:33:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:30.269 18:33:13 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:30.269 18:33:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:29:30.269 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:30.269 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:30.269 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2624946 00:29:30.269 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:30.269 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:30.269 18:33:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2624946 /var/tmp/spdk.sock 00:29:30.269 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2624946 ']' 00:29:30.269 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:30.269 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:30.269 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:30.269 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:29:30.269 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:30.269 18:33:13 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:30.269 [2024-07-12 18:33:13.812755] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:29:30.269 [2024-07-12 18:33:13.812827] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2624946 ] 00:29:30.269 [2024-07-12 18:33:13.946469] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:30.526 [2024-07-12 18:33:14.052510] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:30.526 [2024-07-12 18:33:14.052534] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:30.526 [2024-07-12 18:33:14.052542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:30.526 [2024-07-12 18:33:14.127430] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:29:31.092 18:33:14 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:31.092 18:33:14 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:29:31.092 18:33:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:29:31.092 18:33:14 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:31.350 Malloc0 00:29:31.350 Malloc1 00:29:31.350 Malloc2 00:29:31.350 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:29:31.350 18:33:15 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:29:31.350 18:33:15 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:31.350 18:33:15 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:31.609 5000+0 records in 00:29:31.609 5000+0 records out 00:29:31.609 10240000 bytes (10 MB, 9.8 MiB) copied, 0.024667 s, 415 MB/s 00:29:31.609 18:33:15 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:31.866 AIO0 00:29:31.866 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 2624946 00:29:31.866 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 2624946 without_thd 00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2624946 00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:31.867 18:33:15 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:32.125 18:33:15 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:29:32.125 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:29:32.125 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:29:32.125 18:33:15 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:29:32.125 18:33:15 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:32.125 18:33:15 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:29:32.125 18:33:15 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:32.125 18:33:15 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:32.125 18:33:15 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:29:32.383 spdk_thread ids are 1 on reactor0. 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2624946 0 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2624946 0 idle 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2624946 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2624946 -w 256 00:29:32.383 18:33:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2624946 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0' 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2624946 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:32.383 18:33:16 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2624946 1 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2624946 1 idle 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2624946 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2624946 -w 256 00:29:32.383 18:33:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2624951 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 
00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2624951 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2624946 2 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2624946 2 idle 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2624946 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:32.641 18:33:16 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 2624946 -w 256 00:29:32.641 18:33:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:32.913 18:33:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2624952 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:29:32.913 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2624952 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:29:32.914 18:33:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:29:33.183 [2024-07-12 18:33:16.641556] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:29:33.183 18:33:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:29:33.183 [2024-07-12 18:33:16.885240] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:29:33.183 [2024-07-12 18:33:16.885559] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:33.183 18:33:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:29:33.441 [2024-07-12 18:33:17.137122] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:29:33.441 [2024-07-12 18:33:17.137259] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2624946 0 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2624946 0 busy 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2624946 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:33.441 18:33:17 reactor_set_interrupt 
-- interrupt/common.sh@24 -- # grep reactor_0 00:29:33.441 18:33:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2624946 -w 256 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2624946 root 20 0 128.2g 36864 23616 R 93.8 0.0 0:00.84 reactor_0' 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2624946 root 20 0 128.2g 36864 23616 R 93.8 0.0 0:00.84 reactor_0 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2624946 2 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2624946 2 busy 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2624946 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:33.698 18:33:17 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2624946 -w 256 00:29:33.698 18:33:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2624952 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2624952 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:33.954 18:33:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:29:34.211 [2024-07-12 18:33:17.753121] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:29:34.211 [2024-07-12 18:33:17.753228] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2624946 2 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2624946 2 idle 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2624946 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2624946 -w 256 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:34.211 18:33:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2624952 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.61 reactor_2' 00:29:34.516 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2624952 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.61 reactor_2 00:29:34.516 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:34.516 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:34.516 18:33:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:34.516 18:33:17 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:34.516 18:33:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:34.516 18:33:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:34.516 18:33:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:34.516 18:33:17 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:34.516 18:33:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:29:34.516 [2024-07-12 18:33:18.177117] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:29:34.516 [2024-07-12 18:33:18.177278] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:34.516 18:33:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:29:34.516 18:33:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:29:34.516 18:33:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:29:34.774 [2024-07-12 18:33:18.413309] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2624946 0 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2624946 0 idle 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2624946 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2624946 -w 256 00:29:34.774 18:33:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2624946 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.70 reactor_0' 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2624946 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.70 reactor_0 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:29:35.032 18:33:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 2624946 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2624946 ']' 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2624946 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2624946 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2624946' 00:29:35.032 killing process with pid 2624946 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2624946 00:29:35.032 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2624946 00:29:35.290 18:33:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:29:35.290 18:33:18 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:35.290 18:33:18 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:29:35.290 18:33:18 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:35.290 18:33:18 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:35.290 18:33:18 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2625715 00:29:35.290 18:33:18 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:35.290 18:33:18 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:35.290 18:33:18 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2625715 /var/tmp/spdk.sock 00:29:35.290 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2625715 ']' 00:29:35.290 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:35.290 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:35.290 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:35.290 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:35.290 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:35.290 18:33:18 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:35.290 [2024-07-12 18:33:18.960097] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:29:35.290 [2024-07-12 18:33:18.960169] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2625715 ] 00:29:35.548 [2024-07-12 18:33:19.090831] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:35.548 [2024-07-12 18:33:19.192946] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:35.548 [2024-07-12 18:33:19.193000] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:35.548 [2024-07-12 18:33:19.193006] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:35.548 [2024-07-12 18:33:19.268562] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:29:36.480 18:33:19 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:36.480 18:33:19 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:29:36.480 18:33:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:29:36.480 18:33:19 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:36.480 Malloc0 00:29:36.481 Malloc1 00:29:36.481 Malloc2 00:29:36.481 18:33:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:29:36.481 18:33:20 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:29:36.481 18:33:20 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:36.481 18:33:20 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:36.739 5000+0 records in 00:29:36.739 5000+0 records out 00:29:36.739 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0257337 s, 398 MB/s 
00:29:36.739 18:33:20 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:36.997 AIO0 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 2625715 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 2625715 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2625715 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:36.997 18:33:20 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:37.255 18:33:20 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:29:37.255 18:33:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:29:37.255 18:33:20 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:29:37.255 18:33:20 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:29:37.255 18:33:20 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:29:37.255 18:33:20 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:29:37.255 18:33:20 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:37.255 18:33:20 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:29:37.255 18:33:20 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:29:37.513 18:33:21 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:29:37.514 spdk_thread ids are 1 on reactor0. 
00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2625715 0 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2625715 0 idle 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2625715 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2625715 -w 256 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2625715 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.39 reactor_0' 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2625715 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.39 reactor_0 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2625715 1 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2625715 1 idle 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2625715 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2625715 -w 256 00:29:37.514 18:33:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2625718 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2625718 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:37.772 18:33:21 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2625715 2 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2625715 2 idle 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2625715 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2625715 -w 256 00:29:37.772 18:33:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2625719 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 
reactor_2' 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2625719 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:29:38.031 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:29:38.289 [2024-07-12 18:33:21.769968] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:29:38.289 [2024-07-12 18:33:21.770180] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 
00:29:38.289 [2024-07-12 18:33:21.770398] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:38.289 18:33:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:29:38.548 [2024-07-12 18:33:22.018396] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:29:38.548 [2024-07-12 18:33:22.018493] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2625715 0 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2625715 0 busy 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2625715 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2625715 -w 256 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2625715 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.83 reactor_0' 00:29:38.548 18:33:22 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # echo 2625715 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.83 reactor_0 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2625715 2 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2625715 2 busy 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2625715 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2625715 -w 256 00:29:38.548 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:38.806 
18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2625719 root 20 0 128.2g 36864 23616 R 93.3 0.0 0:00.35 reactor_2' 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2625719 root 20 0 128.2g 36864 23616 R 93.3 0.0 0:00.35 reactor_2 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.3 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]] 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:38.806 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:29:39.064 [2024-07-12 18:33:22.620106] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:29:39.064 [2024-07-12 18:33:22.620220] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2625715 2 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2625715 2 idle 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2625715 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2625715 -w 256 00:29:39.064 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:29:39.321 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2625719 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.59 reactor_2' 00:29:39.321 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:39.321 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2625719 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.59 reactor_2 00:29:39.321 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:39.321 18:33:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:29:39.322 18:33:22 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:29:39.322 [2024-07-12 18:33:22.973012] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:29:39.322 [2024-07-12 18:33:22.973202] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 00:29:39.322 [2024-07-12 18:33:22.973225] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2625715 0 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2625715 0 idle 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2625715 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:29:39.322 18:33:22 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2625715 -w 256 00:29:39.322 18:33:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2625715 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.60 reactor_0' 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2625715 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.60 reactor_0 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:29:39.580 18:33:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 2625715 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2625715 ']' 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 
2625715 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2625715 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2625715' 00:29:39.580 killing process with pid 2625715 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2625715 00:29:39.580 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2625715 00:29:39.839 18:33:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:29:39.839 18:33:23 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:39.839 00:29:39.839 real 0m9.978s 00:29:39.839 user 0m9.380s 00:29:39.839 sys 0m2.113s 00:29:39.839 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:39.839 18:33:23 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:29:39.839 ************************************ 00:29:39.839 END TEST reactor_set_interrupt 00:29:39.839 ************************************ 00:29:39.839 18:33:23 -- common/autotest_common.sh@1142 -- # return 0 00:29:39.839 18:33:23 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:39.839 18:33:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:39.839 18:33:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:39.839 18:33:23 -- 
common/autotest_common.sh@10 -- # set +x 00:29:39.839 ************************************ 00:29:39.839 START TEST reap_unregistered_poller 00:29:39.839 ************************************ 00:29:39.839 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:40.099 * Looking for test storage... 00:29:40.099 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:40.099 18:33:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:29:40.099 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:29:40.099 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:40.099 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:40.099 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:29:40.099 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:40.099 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:29:40.099 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:29:40.099 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:29:40.099 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:29:40.099 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:29:40.099 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:29:40.099 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:29:40.100 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:29:40.100 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@7 -- # 
CONFIG_PREFIX=/usr/local 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:29:40.100 
18:33:23 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:29:40.100 18:33:23 reap_unregistered_poller -- 
common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@65 
-- # CONFIG_APPS=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:29:40.100 18:33:23 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:29:40.100 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:40.100 18:33:23 
reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:29:40.100 18:33:23 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef 
SPDK_CONFIG_H 00:29:40.100 #define SPDK_CONFIG_H 00:29:40.100 #define SPDK_CONFIG_APPS 1 00:29:40.100 #define SPDK_CONFIG_ARCH native 00:29:40.100 #undef SPDK_CONFIG_ASAN 00:29:40.100 #undef SPDK_CONFIG_AVAHI 00:29:40.100 #undef SPDK_CONFIG_CET 00:29:40.100 #define SPDK_CONFIG_COVERAGE 1 00:29:40.100 #define SPDK_CONFIG_CROSS_PREFIX 00:29:40.100 #define SPDK_CONFIG_CRYPTO 1 00:29:40.100 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:29:40.100 #undef SPDK_CONFIG_CUSTOMOCF 00:29:40.100 #undef SPDK_CONFIG_DAOS 00:29:40.100 #define SPDK_CONFIG_DAOS_DIR 00:29:40.100 #define SPDK_CONFIG_DEBUG 1 00:29:40.100 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:29:40.100 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:29:40.100 #define SPDK_CONFIG_DPDK_INC_DIR 00:29:40.100 #define SPDK_CONFIG_DPDK_LIB_DIR 00:29:40.100 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:29:40.100 #undef SPDK_CONFIG_DPDK_UADK 00:29:40.100 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:29:40.100 #define SPDK_CONFIG_EXAMPLES 1 00:29:40.100 #undef SPDK_CONFIG_FC 00:29:40.100 #define SPDK_CONFIG_FC_PATH 00:29:40.100 #define SPDK_CONFIG_FIO_PLUGIN 1 00:29:40.100 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:29:40.100 #undef SPDK_CONFIG_FUSE 00:29:40.100 #undef SPDK_CONFIG_FUZZER 00:29:40.100 #define SPDK_CONFIG_FUZZER_LIB 00:29:40.100 #undef SPDK_CONFIG_GOLANG 00:29:40.100 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:29:40.100 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:29:40.100 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:29:40.100 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:29:40.100 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:29:40.100 #undef SPDK_CONFIG_HAVE_LIBBSD 00:29:40.100 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:29:40.101 #define SPDK_CONFIG_IDXD 1 00:29:40.101 #define SPDK_CONFIG_IDXD_KERNEL 1 00:29:40.101 #define SPDK_CONFIG_IPSEC_MB 1 00:29:40.101 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 
00:29:40.101 #define SPDK_CONFIG_ISAL 1 00:29:40.101 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:29:40.101 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:29:40.101 #define SPDK_CONFIG_LIBDIR 00:29:40.101 #undef SPDK_CONFIG_LTO 00:29:40.101 #define SPDK_CONFIG_MAX_LCORES 128 00:29:40.101 #define SPDK_CONFIG_NVME_CUSE 1 00:29:40.101 #undef SPDK_CONFIG_OCF 00:29:40.101 #define SPDK_CONFIG_OCF_PATH 00:29:40.101 #define SPDK_CONFIG_OPENSSL_PATH 00:29:40.101 #undef SPDK_CONFIG_PGO_CAPTURE 00:29:40.101 #define SPDK_CONFIG_PGO_DIR 00:29:40.101 #undef SPDK_CONFIG_PGO_USE 00:29:40.101 #define SPDK_CONFIG_PREFIX /usr/local 00:29:40.101 #undef SPDK_CONFIG_RAID5F 00:29:40.101 #undef SPDK_CONFIG_RBD 00:29:40.101 #define SPDK_CONFIG_RDMA 1 00:29:40.101 #define SPDK_CONFIG_RDMA_PROV verbs 00:29:40.101 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:29:40.101 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:29:40.101 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:29:40.101 #define SPDK_CONFIG_SHARED 1 00:29:40.101 #undef SPDK_CONFIG_SMA 00:29:40.101 #define SPDK_CONFIG_TESTS 1 00:29:40.101 #undef SPDK_CONFIG_TSAN 00:29:40.101 #define SPDK_CONFIG_UBLK 1 00:29:40.101 #define SPDK_CONFIG_UBSAN 1 00:29:40.101 #undef SPDK_CONFIG_UNIT_TESTS 00:29:40.101 #undef SPDK_CONFIG_URING 00:29:40.101 #define SPDK_CONFIG_URING_PATH 00:29:40.101 #undef SPDK_CONFIG_URING_ZNS 00:29:40.101 #undef SPDK_CONFIG_USDT 00:29:40.101 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:29:40.101 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:29:40.101 #undef SPDK_CONFIG_VFIO_USER 00:29:40.101 #define SPDK_CONFIG_VFIO_USER_DIR 00:29:40.101 #define SPDK_CONFIG_VHOST 1 00:29:40.101 #define SPDK_CONFIG_VIRTIO 1 00:29:40.101 #undef SPDK_CONFIG_VTUNE 00:29:40.101 #define SPDK_CONFIG_VTUNE_DIR 00:29:40.101 #define SPDK_CONFIG_WERROR 1 00:29:40.101 #define SPDK_CONFIG_WPDK_DIR 00:29:40.101 #undef SPDK_CONFIG_XNVME 00:29:40.101 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:29:40.101 18:33:23 reap_unregistered_poller 
-- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:40.101 18:33:23 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:40.101 18:33:23 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:40.101 18:33:23 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:40.101 18:33:23 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:40.101 18:33:23 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:40.101 18:33:23 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:40.101 18:33:23 
reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:29:40.101 18:33:23 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:29:40.101 18:33:23 reap_unregistered_poller -- 
pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:29:40.101 18:33:23 reap_unregistered_poller -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- 
common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- 
common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:29:40.101 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:29:40.102 18:33:23 reap_unregistered_poller -- 
common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:29:40.102 18:33:23 reap_unregistered_poller -- 
common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- 
common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@177 -- # 
export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:29:40.102 18:33:23 reap_unregistered_poller -- 
common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:29:40.102 18:33:23 
reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export 
VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:29:40.102 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@279 -- 
# MAKE=make 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 2626377 ]] 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 2626377 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.o8vmtb 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:29:40.103 18:33:23 reap_unregistered_poller -- 
common/autotest_common.sh@345 -- # [[ -n '' ]] 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.o8vmtb/tests/interrupt /tmp/spdk.o8vmtb 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4338139136 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88940404736 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5568110592 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47250882560 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892308480 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:29:40.103 18:33:23 reap_unregistered_poller -- 
common/autotest_common.sh@363 -- # uses["$mount"]=9396224 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253807104 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=450560 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:29:40.103 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:29:40.362 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:29:40.362 * Looking for test storage... 
00:29:40.362 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88940404736 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7782703104 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:40.363 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2626519 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:29:40.363 18:33:23 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2626519 /var/tmp/spdk.sock 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 2626519 ']' 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:40.363 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:40.363 18:33:23 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:40.363 [2024-07-12 18:33:23.889293] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:29:40.363 [2024-07-12 18:33:23.889365] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2626519 ] 00:29:40.363 [2024-07-12 18:33:24.019302] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:40.622 [2024-07-12 18:33:24.118527] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:40.622 [2024-07-12 18:33:24.118611] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:40.622 [2024-07-12 18:33:24.118616] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:40.622 [2024-07-12 18:33:24.192683] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:29:41.214 18:33:24 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:41.214 18:33:24 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:29:41.214 18:33:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:29:41.214 18:33:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:29:41.214 18:33:24 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:41.214 18:33:24 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:41.214 18:33:24 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:41.214 18:33:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:29:41.214 "name": "app_thread", 00:29:41.214 "id": 1, 00:29:41.214 "active_pollers": [], 00:29:41.214 "timed_pollers": [ 00:29:41.214 { 00:29:41.214 "name": "rpc_subsystem_poll_servers", 00:29:41.214 "id": 1, 00:29:41.214 "state": "waiting", 00:29:41.214 "run_count": 0, 00:29:41.214 "busy_count": 0, 00:29:41.214 "period_ticks": 9200000 00:29:41.214 } 00:29:41.214 ], 00:29:41.214 "paused_pollers": [] 00:29:41.214 }' 00:29:41.214 18:33:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:29:41.214 18:33:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:29:41.214 18:33:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:29:41.473 18:33:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:29:41.473 18:33:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:29:41.473 18:33:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:29:41.473 
18:33:24 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:29:41.473 18:33:24 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:29:41.473 18:33:24 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:29:41.473 5000+0 records in 00:29:41.473 5000+0 records out 00:29:41.473 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0236862 s, 432 MB/s 00:29:41.473 18:33:25 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:29:41.732 AIO0 00:29:41.732 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:41.990 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:29:41.990 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:29:41.990 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:29:41.990 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:41.990 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:41.990 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:41.990 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:29:41.990 "name": "app_thread", 00:29:41.990 "id": 1, 00:29:41.990 "active_pollers": [], 00:29:41.990 "timed_pollers": [ 00:29:41.990 { 00:29:41.990 "name": "rpc_subsystem_poll_servers", 00:29:41.990 "id": 1, 00:29:41.990 "state": "waiting", 00:29:41.990 "run_count": 0, 00:29:41.990 "busy_count": 0, 
00:29:41.990 "period_ticks": 9200000 00:29:41.990 } 00:29:41.990 ], 00:29:41.990 "paused_pollers": [] 00:29:41.990 }' 00:29:41.990 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:29:42.249 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:29:42.249 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:29:42.249 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:29:42.249 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:29:42.249 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:29:42.249 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:29:42.249 18:33:25 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 2626519 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 2626519 ']' 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 2626519 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2626519 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@966 -- # 
echo 'killing process with pid 2626519' 00:29:42.249 killing process with pid 2626519 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 2626519 00:29:42.249 18:33:25 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 2626519 00:29:42.508 18:33:26 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:29:42.508 18:33:26 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:29:42.508 00:29:42.508 real 0m2.512s 00:29:42.508 user 0m1.597s 00:29:42.508 sys 0m0.652s 00:29:42.508 18:33:26 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:42.508 18:33:26 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:29:42.508 ************************************ 00:29:42.508 END TEST reap_unregistered_poller 00:29:42.508 ************************************ 00:29:42.508 18:33:26 -- common/autotest_common.sh@1142 -- # return 0 00:29:42.508 18:33:26 -- spdk/autotest.sh@198 -- # uname -s 00:29:42.508 18:33:26 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:29:42.508 18:33:26 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:29:42.508 18:33:26 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:29:42.508 18:33:26 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@260 -- # timing_exit lib 00:29:42.508 18:33:26 -- common/autotest_common.sh@728 -- # xtrace_disable 00:29:42.508 18:33:26 -- common/autotest_common.sh@10 -- # set +x 00:29:42.508 18:33:26 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:29:42.508 
18:33:26 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:29:42.508 18:33:26 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:29:42.508 18:33:26 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:42.508 18:33:26 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:42.508 18:33:26 -- common/autotest_common.sh@10 -- # set +x 00:29:42.508 ************************************ 00:29:42.508 START TEST compress_compdev 00:29:42.508 ************************************ 00:29:42.508 18:33:26 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:29:42.767 * Looking for test storage... 
00:29:42.767 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:29:42.767 18:33:26 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:42.767 18:33:26 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:42.767 18:33:26 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:42.767 18:33:26 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:42.767 18:33:26 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:42.767 18:33:26 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:42.767 18:33:26 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:42.767 18:33:26 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:29:42.767 18:33:26 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:29:42.767 18:33:26 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:29:42.767 18:33:26 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:42.767 18:33:26 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:29:42.767 18:33:26 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:29:42.767 18:33:26 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:29:42.767 18:33:26 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:42.767 18:33:26 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2626854 00:29:42.767 18:33:26 compress_compdev -- 
compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:42.767 18:33:26 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:42.767 18:33:26 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2626854 00:29:42.767 18:33:26 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2626854 ']' 00:29:42.767 18:33:26 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:42.767 18:33:26 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:42.767 18:33:26 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:42.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:42.767 18:33:26 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:42.767 18:33:26 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:42.767 [2024-07-12 18:33:26.392474] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:29:42.767 [2024-07-12 18:33:26.392545] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2626854 ] 00:29:43.025 [2024-07-12 18:33:26.511879] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:43.025 [2024-07-12 18:33:26.609614] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:43.025 [2024-07-12 18:33:26.609620] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:43.961 [2024-07-12 18:33:27.350299] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:43.961 18:33:27 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:43.961 18:33:27 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:43.961 18:33:27 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:29:43.961 18:33:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:43.961 18:33:27 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:44.527 [2024-07-12 18:33:27.991167] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23943c0 PMD being used: compress_qat 00:29:44.527 18:33:28 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:44.527 18:33:28 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:44.527 18:33:28 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:44.527 18:33:28 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:44.527 18:33:28 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:44.527 18:33:28 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:29:44.527 18:33:28 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:44.784 18:33:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:44.784 [ 00:29:44.784 { 00:29:44.784 "name": "Nvme0n1", 00:29:44.784 "aliases": [ 00:29:44.784 "01000000-0000-0000-5cd2-e43197705251" 00:29:44.784 ], 00:29:44.784 "product_name": "NVMe disk", 00:29:44.784 "block_size": 512, 00:29:44.784 "num_blocks": 15002931888, 00:29:44.784 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:44.784 "assigned_rate_limits": { 00:29:44.784 "rw_ios_per_sec": 0, 00:29:44.784 "rw_mbytes_per_sec": 0, 00:29:44.784 "r_mbytes_per_sec": 0, 00:29:44.784 "w_mbytes_per_sec": 0 00:29:44.784 }, 00:29:44.784 "claimed": false, 00:29:44.784 "zoned": false, 00:29:44.784 "supported_io_types": { 00:29:44.784 "read": true, 00:29:44.784 "write": true, 00:29:44.784 "unmap": true, 00:29:44.784 "flush": true, 00:29:44.784 "reset": true, 00:29:44.784 "nvme_admin": true, 00:29:44.784 "nvme_io": true, 00:29:44.784 "nvme_io_md": false, 00:29:44.784 "write_zeroes": true, 00:29:44.784 "zcopy": false, 00:29:44.784 "get_zone_info": false, 00:29:44.784 "zone_management": false, 00:29:44.784 "zone_append": false, 00:29:44.784 "compare": false, 00:29:44.784 "compare_and_write": false, 00:29:44.784 "abort": true, 00:29:44.784 "seek_hole": false, 00:29:44.784 "seek_data": false, 00:29:44.784 "copy": false, 00:29:44.784 "nvme_iov_md": false 00:29:44.784 }, 00:29:44.784 "driver_specific": { 00:29:44.784 "nvme": [ 00:29:44.784 { 00:29:44.784 "pci_address": "0000:5e:00.0", 00:29:44.784 "trid": { 00:29:44.784 "trtype": "PCIe", 00:29:44.784 "traddr": "0000:5e:00.0" 00:29:44.784 }, 00:29:44.784 "ctrlr_data": { 00:29:44.784 "cntlid": 0, 00:29:44.784 "vendor_id": "0x8086", 00:29:44.784 "model_number": "INTEL SSDPF2KX076TZO", 00:29:44.784 
"serial_number": "PHAC0301002G7P6CGN", 00:29:44.784 "firmware_revision": "JCV10200", 00:29:44.784 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:44.784 "oacs": { 00:29:44.784 "security": 1, 00:29:44.784 "format": 1, 00:29:44.784 "firmware": 1, 00:29:44.784 "ns_manage": 1 00:29:44.784 }, 00:29:44.784 "multi_ctrlr": false, 00:29:44.784 "ana_reporting": false 00:29:44.784 }, 00:29:44.784 "vs": { 00:29:44.784 "nvme_version": "1.3" 00:29:44.784 }, 00:29:44.784 "ns_data": { 00:29:44.784 "id": 1, 00:29:44.785 "can_share": false 00:29:44.785 }, 00:29:44.785 "security": { 00:29:44.785 "opal": true 00:29:44.785 } 00:29:44.785 } 00:29:44.785 ], 00:29:44.785 "mp_policy": "active_passive" 00:29:44.785 } 00:29:44.785 } 00:29:44.785 ] 00:29:45.042 18:33:28 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:45.042 18:33:28 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:45.042 [2024-07-12 18:33:28.752880] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21f90d0 PMD being used: compress_qat 00:29:47.571 e9a2a9c5-6b6e-4dd4-9eb2-97c8bd8f6daf 00:29:47.571 18:33:30 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:47.571 4a53041c-3474-440a-a4bc-9d73296d52d8 00:29:47.571 18:33:31 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:47.571 18:33:31 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:47.571 18:33:31 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:47.571 18:33:31 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:47.571 18:33:31 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:47.571 18:33:31 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:29:47.571 18:33:31 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:47.830 18:33:31 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:48.088 [ 00:29:48.088 { 00:29:48.088 "name": "4a53041c-3474-440a-a4bc-9d73296d52d8", 00:29:48.088 "aliases": [ 00:29:48.088 "lvs0/lv0" 00:29:48.088 ], 00:29:48.088 "product_name": "Logical Volume", 00:29:48.088 "block_size": 512, 00:29:48.088 "num_blocks": 204800, 00:29:48.088 "uuid": "4a53041c-3474-440a-a4bc-9d73296d52d8", 00:29:48.088 "assigned_rate_limits": { 00:29:48.088 "rw_ios_per_sec": 0, 00:29:48.088 "rw_mbytes_per_sec": 0, 00:29:48.088 "r_mbytes_per_sec": 0, 00:29:48.088 "w_mbytes_per_sec": 0 00:29:48.088 }, 00:29:48.088 "claimed": false, 00:29:48.088 "zoned": false, 00:29:48.088 "supported_io_types": { 00:29:48.088 "read": true, 00:29:48.088 "write": true, 00:29:48.088 "unmap": true, 00:29:48.088 "flush": false, 00:29:48.088 "reset": true, 00:29:48.088 "nvme_admin": false, 00:29:48.088 "nvme_io": false, 00:29:48.088 "nvme_io_md": false, 00:29:48.088 "write_zeroes": true, 00:29:48.088 "zcopy": false, 00:29:48.088 "get_zone_info": false, 00:29:48.088 "zone_management": false, 00:29:48.088 "zone_append": false, 00:29:48.088 "compare": false, 00:29:48.088 "compare_and_write": false, 00:29:48.088 "abort": false, 00:29:48.088 "seek_hole": true, 00:29:48.088 "seek_data": true, 00:29:48.088 "copy": false, 00:29:48.088 "nvme_iov_md": false 00:29:48.088 }, 00:29:48.088 "driver_specific": { 00:29:48.088 "lvol": { 00:29:48.088 "lvol_store_uuid": "e9a2a9c5-6b6e-4dd4-9eb2-97c8bd8f6daf", 00:29:48.088 "base_bdev": "Nvme0n1", 00:29:48.088 "thin_provision": true, 00:29:48.088 "num_allocated_clusters": 0, 00:29:48.088 "snapshot": false, 00:29:48.088 "clone": false, 00:29:48.088 "esnap_clone": false 00:29:48.088 } 00:29:48.088 } 
00:29:48.088 } 00:29:48.088 ] 00:29:48.088 18:33:31 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:48.088 18:33:31 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:48.088 18:33:31 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:48.346 [2024-07-12 18:33:31.971378] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:48.347 COMP_lvs0/lv0 00:29:48.347 18:33:32 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:48.347 18:33:32 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:48.347 18:33:32 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:48.347 18:33:32 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:48.347 18:33:32 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:48.347 18:33:32 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:48.347 18:33:32 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:48.605 18:33:32 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:48.863 [ 00:29:48.863 { 00:29:48.863 "name": "COMP_lvs0/lv0", 00:29:48.863 "aliases": [ 00:29:48.863 "80649899-2370-5856-aed0-bab175727c25" 00:29:48.863 ], 00:29:48.863 "product_name": "compress", 00:29:48.863 "block_size": 512, 00:29:48.863 "num_blocks": 200704, 00:29:48.863 "uuid": "80649899-2370-5856-aed0-bab175727c25", 00:29:48.863 "assigned_rate_limits": { 00:29:48.863 "rw_ios_per_sec": 0, 00:29:48.863 "rw_mbytes_per_sec": 0, 00:29:48.863 "r_mbytes_per_sec": 0, 00:29:48.863 "w_mbytes_per_sec": 0 00:29:48.863 
}, 00:29:48.863 "claimed": false, 00:29:48.863 "zoned": false, 00:29:48.863 "supported_io_types": { 00:29:48.863 "read": true, 00:29:48.863 "write": true, 00:29:48.863 "unmap": false, 00:29:48.863 "flush": false, 00:29:48.863 "reset": false, 00:29:48.863 "nvme_admin": false, 00:29:48.863 "nvme_io": false, 00:29:48.863 "nvme_io_md": false, 00:29:48.863 "write_zeroes": true, 00:29:48.863 "zcopy": false, 00:29:48.863 "get_zone_info": false, 00:29:48.863 "zone_management": false, 00:29:48.863 "zone_append": false, 00:29:48.863 "compare": false, 00:29:48.863 "compare_and_write": false, 00:29:48.863 "abort": false, 00:29:48.863 "seek_hole": false, 00:29:48.863 "seek_data": false, 00:29:48.863 "copy": false, 00:29:48.863 "nvme_iov_md": false 00:29:48.863 }, 00:29:48.863 "driver_specific": { 00:29:48.863 "compress": { 00:29:48.863 "name": "COMP_lvs0/lv0", 00:29:48.863 "base_bdev_name": "4a53041c-3474-440a-a4bc-9d73296d52d8" 00:29:48.863 } 00:29:48.863 } 00:29:48.863 } 00:29:48.863 ] 00:29:48.863 18:33:32 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:48.863 18:33:32 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:49.123 [2024-07-12 18:33:32.609890] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fddf41b15c0 PMD being used: compress_qat 00:29:49.123 [2024-07-12 18:33:32.612123] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2391670 PMD being used: compress_qat 00:29:49.123 Running I/O for 3 seconds... 
00:29:52.506 00:29:52.506 Latency(us) 00:29:52.506 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:52.506 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:52.506 Verification LBA range: start 0x0 length 0x3100 00:29:52.506 COMP_lvs0/lv0 : 3.00 5144.33 20.10 0.00 0.00 6167.91 418.50 5641.79 00:29:52.506 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:52.506 Verification LBA range: start 0x3100 length 0x3100 00:29:52.506 COMP_lvs0/lv0 : 3.00 5407.85 21.12 0.00 0.00 5880.61 406.04 5755.77 00:29:52.506 =================================================================================================================== 00:29:52.506 Total : 10552.18 41.22 0.00 0.00 6020.69 406.04 5755.77 00:29:52.506 0 00:29:52.506 18:33:35 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:52.506 18:33:35 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:52.506 18:33:35 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:52.506 18:33:36 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:52.506 18:33:36 compress_compdev -- compress/compress.sh@78 -- # killprocess 2626854 00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2626854 ']' 00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2626854 00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2626854 00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 
00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2626854' 00:29:52.506 killing process with pid 2626854 00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@967 -- # kill 2626854 00:29:52.506 Received shutdown signal, test time was about 3.000000 seconds 00:29:52.506 00:29:52.506 Latency(us) 00:29:52.506 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:52.506 =================================================================================================================== 00:29:52.506 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:52.506 18:33:36 compress_compdev -- common/autotest_common.sh@972 -- # wait 2626854 00:29:55.796 18:33:39 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:29:55.796 18:33:39 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:55.796 18:33:39 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2628601 00:29:55.796 18:33:39 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:55.796 18:33:39 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:55.796 18:33:39 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2628601 00:29:55.796 18:33:39 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2628601 ']' 00:29:55.796 18:33:39 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:55.796 18:33:39 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:55.796 18:33:39 compress_compdev -- common/autotest_common.sh@836 
-- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:55.796 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:55.796 18:33:39 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:55.796 18:33:39 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:55.796 [2024-07-12 18:33:39.251458] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:29:55.796 [2024-07-12 18:33:39.251535] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2628601 ] 00:29:55.796 [2024-07-12 18:33:39.373009] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:55.796 [2024-07-12 18:33:39.472393] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:55.796 [2024-07-12 18:33:39.472398] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:56.734 [2024-07-12 18:33:40.229893] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:56.734 18:33:40 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:56.734 18:33:40 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:56.734 18:33:40 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:29:56.734 18:33:40 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:56.734 18:33:40 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:57.303 [2024-07-12 18:33:40.895243] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17543c0 PMD being used: compress_qat 00:29:57.303 18:33:40 compress_compdev -- 
compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:57.303 18:33:40 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:57.303 18:33:40 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:57.303 18:33:40 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:57.303 18:33:40 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:57.303 18:33:40 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:57.303 18:33:40 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:57.561 18:33:41 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:57.820 [ 00:29:57.820 { 00:29:57.820 "name": "Nvme0n1", 00:29:57.820 "aliases": [ 00:29:57.820 "01000000-0000-0000-5cd2-e43197705251" 00:29:57.820 ], 00:29:57.820 "product_name": "NVMe disk", 00:29:57.820 "block_size": 512, 00:29:57.820 "num_blocks": 15002931888, 00:29:57.820 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:29:57.820 "assigned_rate_limits": { 00:29:57.820 "rw_ios_per_sec": 0, 00:29:57.820 "rw_mbytes_per_sec": 0, 00:29:57.820 "r_mbytes_per_sec": 0, 00:29:57.820 "w_mbytes_per_sec": 0 00:29:57.820 }, 00:29:57.820 "claimed": false, 00:29:57.820 "zoned": false, 00:29:57.820 "supported_io_types": { 00:29:57.820 "read": true, 00:29:57.820 "write": true, 00:29:57.820 "unmap": true, 00:29:57.820 "flush": true, 00:29:57.820 "reset": true, 00:29:57.820 "nvme_admin": true, 00:29:57.820 "nvme_io": true, 00:29:57.820 "nvme_io_md": false, 00:29:57.820 "write_zeroes": true, 00:29:57.820 "zcopy": false, 00:29:57.820 "get_zone_info": false, 00:29:57.820 "zone_management": false, 00:29:57.820 "zone_append": false, 00:29:57.820 "compare": false, 00:29:57.820 "compare_and_write": false, 00:29:57.820 "abort": true, 
00:29:57.820 "seek_hole": false, 00:29:57.820 "seek_data": false, 00:29:57.820 "copy": false, 00:29:57.820 "nvme_iov_md": false 00:29:57.820 }, 00:29:57.820 "driver_specific": { 00:29:57.820 "nvme": [ 00:29:57.820 { 00:29:57.820 "pci_address": "0000:5e:00.0", 00:29:57.820 "trid": { 00:29:57.820 "trtype": "PCIe", 00:29:57.820 "traddr": "0000:5e:00.0" 00:29:57.820 }, 00:29:57.820 "ctrlr_data": { 00:29:57.820 "cntlid": 0, 00:29:57.820 "vendor_id": "0x8086", 00:29:57.820 "model_number": "INTEL SSDPF2KX076TZO", 00:29:57.820 "serial_number": "PHAC0301002G7P6CGN", 00:29:57.821 "firmware_revision": "JCV10200", 00:29:57.821 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:29:57.821 "oacs": { 00:29:57.821 "security": 1, 00:29:57.821 "format": 1, 00:29:57.821 "firmware": 1, 00:29:57.821 "ns_manage": 1 00:29:57.821 }, 00:29:57.821 "multi_ctrlr": false, 00:29:57.821 "ana_reporting": false 00:29:57.821 }, 00:29:57.821 "vs": { 00:29:57.821 "nvme_version": "1.3" 00:29:57.821 }, 00:29:57.821 "ns_data": { 00:29:57.821 "id": 1, 00:29:57.821 "can_share": false 00:29:57.821 }, 00:29:57.821 "security": { 00:29:57.821 "opal": true 00:29:57.821 } 00:29:57.821 } 00:29:57.821 ], 00:29:57.821 "mp_policy": "active_passive" 00:29:57.821 } 00:29:57.821 } 00:29:57.821 ] 00:29:57.821 18:33:41 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:57.821 18:33:41 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:58.080 [2024-07-12 18:33:41.648882] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15b90d0 PMD being used: compress_qat 00:30:00.611 d1b35409-dbe0-45ce-afde-08a57931b021 00:30:00.611 18:33:43 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:00.611 f7d35850-9d1d-435d-a20c-092c4556c1ed 00:30:00.611 18:33:44 compress_compdev -- 
compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:00.611 18:33:44 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:00.611 18:33:44 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:00.611 18:33:44 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:00.611 18:33:44 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:00.611 18:33:44 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:00.611 18:33:44 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:00.870 18:33:44 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:00.870 [ 00:30:00.870 { 00:30:00.870 "name": "f7d35850-9d1d-435d-a20c-092c4556c1ed", 00:30:00.870 "aliases": [ 00:30:00.870 "lvs0/lv0" 00:30:00.870 ], 00:30:00.870 "product_name": "Logical Volume", 00:30:00.870 "block_size": 512, 00:30:00.870 "num_blocks": 204800, 00:30:00.870 "uuid": "f7d35850-9d1d-435d-a20c-092c4556c1ed", 00:30:00.870 "assigned_rate_limits": { 00:30:00.870 "rw_ios_per_sec": 0, 00:30:00.870 "rw_mbytes_per_sec": 0, 00:30:00.870 "r_mbytes_per_sec": 0, 00:30:00.870 "w_mbytes_per_sec": 0 00:30:00.870 }, 00:30:00.870 "claimed": false, 00:30:00.870 "zoned": false, 00:30:00.870 "supported_io_types": { 00:30:00.870 "read": true, 00:30:00.870 "write": true, 00:30:00.870 "unmap": true, 00:30:00.870 "flush": false, 00:30:00.870 "reset": true, 00:30:00.870 "nvme_admin": false, 00:30:00.870 "nvme_io": false, 00:30:00.870 "nvme_io_md": false, 00:30:00.870 "write_zeroes": true, 00:30:00.870 "zcopy": false, 00:30:00.870 "get_zone_info": false, 00:30:00.870 "zone_management": false, 00:30:00.870 "zone_append": false, 00:30:00.870 "compare": false, 00:30:00.870 "compare_and_write": false, 00:30:00.870 "abort": false, 
00:30:00.870 "seek_hole": true, 00:30:00.870 "seek_data": true, 00:30:00.870 "copy": false, 00:30:00.870 "nvme_iov_md": false 00:30:00.870 }, 00:30:00.870 "driver_specific": { 00:30:00.870 "lvol": { 00:30:00.870 "lvol_store_uuid": "d1b35409-dbe0-45ce-afde-08a57931b021", 00:30:00.870 "base_bdev": "Nvme0n1", 00:30:00.870 "thin_provision": true, 00:30:00.870 "num_allocated_clusters": 0, 00:30:00.870 "snapshot": false, 00:30:00.870 "clone": false, 00:30:00.870 "esnap_clone": false 00:30:00.870 } 00:30:00.870 } 00:30:00.870 } 00:30:00.870 ] 00:30:01.129 18:33:44 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:01.129 18:33:44 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:01.129 18:33:44 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:01.129 [2024-07-12 18:33:44.827316] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:01.129 COMP_lvs0/lv0 00:30:01.129 18:33:44 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:01.129 18:33:44 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:01.129 18:33:44 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:01.129 18:33:44 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:01.129 18:33:44 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:01.129 18:33:44 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:01.129 18:33:44 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:01.696 18:33:45 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 
00:30:01.696 [ 00:30:01.696 { 00:30:01.696 "name": "COMP_lvs0/lv0", 00:30:01.696 "aliases": [ 00:30:01.696 "1e56b3a4-7481-5e9d-bc9c-614371aa32e1" 00:30:01.696 ], 00:30:01.696 "product_name": "compress", 00:30:01.696 "block_size": 512, 00:30:01.696 "num_blocks": 200704, 00:30:01.696 "uuid": "1e56b3a4-7481-5e9d-bc9c-614371aa32e1", 00:30:01.696 "assigned_rate_limits": { 00:30:01.696 "rw_ios_per_sec": 0, 00:30:01.696 "rw_mbytes_per_sec": 0, 00:30:01.696 "r_mbytes_per_sec": 0, 00:30:01.696 "w_mbytes_per_sec": 0 00:30:01.696 }, 00:30:01.696 "claimed": false, 00:30:01.696 "zoned": false, 00:30:01.696 "supported_io_types": { 00:30:01.696 "read": true, 00:30:01.696 "write": true, 00:30:01.696 "unmap": false, 00:30:01.696 "flush": false, 00:30:01.696 "reset": false, 00:30:01.696 "nvme_admin": false, 00:30:01.696 "nvme_io": false, 00:30:01.696 "nvme_io_md": false, 00:30:01.696 "write_zeroes": true, 00:30:01.696 "zcopy": false, 00:30:01.696 "get_zone_info": false, 00:30:01.696 "zone_management": false, 00:30:01.696 "zone_append": false, 00:30:01.696 "compare": false, 00:30:01.696 "compare_and_write": false, 00:30:01.696 "abort": false, 00:30:01.696 "seek_hole": false, 00:30:01.696 "seek_data": false, 00:30:01.696 "copy": false, 00:30:01.696 "nvme_iov_md": false 00:30:01.696 }, 00:30:01.696 "driver_specific": { 00:30:01.696 "compress": { 00:30:01.696 "name": "COMP_lvs0/lv0", 00:30:01.696 "base_bdev_name": "f7d35850-9d1d-435d-a20c-092c4556c1ed" 00:30:01.696 } 00:30:01.696 } 00:30:01.696 } 00:30:01.696 ] 00:30:01.696 18:33:45 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:01.696 18:33:45 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:01.954 [2024-07-12 18:33:45.473792] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f66f41b15c0 PMD being used: compress_qat 00:30:01.954 [2024-07-12 18:33:45.476026] accel_dpdk_compressdev.c: 690:_set_pmd: 
*NOTICE*: Channel 0x1751700 PMD being used: compress_qat 00:30:01.954 Running I/O for 3 seconds... 00:30:05.240 00:30:05.240 Latency(us) 00:30:05.240 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:05.240 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:05.240 Verification LBA range: start 0x0 length 0x3100 00:30:05.240 COMP_lvs0/lv0 : 3.00 5160.14 20.16 0.00 0.00 6150.29 477.27 5727.28 00:30:05.240 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:05.240 Verification LBA range: start 0x3100 length 0x3100 00:30:05.240 COMP_lvs0/lv0 : 3.00 5422.43 21.18 0.00 0.00 5864.89 338.37 5470.83 00:30:05.240 =================================================================================================================== 00:30:05.240 Total : 10582.57 41.34 0.00 0.00 6004.05 338.37 5727.28 00:30:05.240 0 00:30:05.240 18:33:48 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:05.240 18:33:48 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:05.240 18:33:48 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:05.499 18:33:49 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:05.499 18:33:49 compress_compdev -- compress/compress.sh@78 -- # killprocess 2628601 00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2628601 ']' 00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2628601 00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2628601 
00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2628601' 00:30:05.499 killing process with pid 2628601 00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@967 -- # kill 2628601 00:30:05.499 Received shutdown signal, test time was about 3.000000 seconds 00:30:05.499 00:30:05.499 Latency(us) 00:30:05.499 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:05.499 =================================================================================================================== 00:30:05.499 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:05.499 18:33:49 compress_compdev -- common/autotest_common.sh@972 -- # wait 2628601 00:30:08.788 18:33:52 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:08.788 18:33:52 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:08.788 18:33:52 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2630207 00:30:08.788 18:33:52 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:08.788 18:33:52 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:08.788 18:33:52 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2630207 00:30:08.788 18:33:52 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2630207 ']' 00:30:08.788 18:33:52 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:08.788 18:33:52 compress_compdev -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:30:08.788 18:33:52 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:08.788 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:08.788 18:33:52 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:08.788 18:33:52 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:08.788 [2024-07-12 18:33:52.132953] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:30:08.788 [2024-07-12 18:33:52.133034] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2630207 ] 00:30:08.788 [2024-07-12 18:33:52.253985] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:08.788 [2024-07-12 18:33:52.356685] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:08.788 [2024-07-12 18:33:52.356691] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:09.722 [2024-07-12 18:33:53.105253] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:09.722 18:33:53 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:09.722 18:33:53 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:09.722 18:33:53 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:30:09.722 18:33:53 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:09.722 18:33:53 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:10.288 [2024-07-12 18:33:53.751016] accel_dpdk_compressdev.c: 
690:_set_pmd: *NOTICE*: Channel 0x104b3c0 PMD being used: compress_qat 00:30:10.288 18:33:53 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:10.288 18:33:53 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:10.288 18:33:53 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:10.288 18:33:53 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:10.288 18:33:53 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:10.288 18:33:53 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:10.288 18:33:53 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:10.547 18:33:54 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:10.547 [ 00:30:10.547 { 00:30:10.547 "name": "Nvme0n1", 00:30:10.547 "aliases": [ 00:30:10.547 "01000000-0000-0000-5cd2-e43197705251" 00:30:10.547 ], 00:30:10.547 "product_name": "NVMe disk", 00:30:10.547 "block_size": 512, 00:30:10.547 "num_blocks": 15002931888, 00:30:10.547 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:10.547 "assigned_rate_limits": { 00:30:10.547 "rw_ios_per_sec": 0, 00:30:10.547 "rw_mbytes_per_sec": 0, 00:30:10.547 "r_mbytes_per_sec": 0, 00:30:10.547 "w_mbytes_per_sec": 0 00:30:10.547 }, 00:30:10.547 "claimed": false, 00:30:10.547 "zoned": false, 00:30:10.547 "supported_io_types": { 00:30:10.547 "read": true, 00:30:10.547 "write": true, 00:30:10.547 "unmap": true, 00:30:10.547 "flush": true, 00:30:10.547 "reset": true, 00:30:10.547 "nvme_admin": true, 00:30:10.547 "nvme_io": true, 00:30:10.547 "nvme_io_md": false, 00:30:10.547 "write_zeroes": true, 00:30:10.547 "zcopy": false, 00:30:10.547 "get_zone_info": false, 00:30:10.547 "zone_management": false, 00:30:10.547 "zone_append": 
false, 00:30:10.547 "compare": false, 00:30:10.547 "compare_and_write": false, 00:30:10.547 "abort": true, 00:30:10.547 "seek_hole": false, 00:30:10.547 "seek_data": false, 00:30:10.547 "copy": false, 00:30:10.547 "nvme_iov_md": false 00:30:10.547 }, 00:30:10.547 "driver_specific": { 00:30:10.547 "nvme": [ 00:30:10.547 { 00:30:10.547 "pci_address": "0000:5e:00.0", 00:30:10.547 "trid": { 00:30:10.547 "trtype": "PCIe", 00:30:10.547 "traddr": "0000:5e:00.0" 00:30:10.547 }, 00:30:10.547 "ctrlr_data": { 00:30:10.547 "cntlid": 0, 00:30:10.547 "vendor_id": "0x8086", 00:30:10.547 "model_number": "INTEL SSDPF2KX076TZO", 00:30:10.547 "serial_number": "PHAC0301002G7P6CGN", 00:30:10.547 "firmware_revision": "JCV10200", 00:30:10.547 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:10.547 "oacs": { 00:30:10.547 "security": 1, 00:30:10.547 "format": 1, 00:30:10.547 "firmware": 1, 00:30:10.547 "ns_manage": 1 00:30:10.547 }, 00:30:10.547 "multi_ctrlr": false, 00:30:10.547 "ana_reporting": false 00:30:10.547 }, 00:30:10.547 "vs": { 00:30:10.547 "nvme_version": "1.3" 00:30:10.547 }, 00:30:10.547 "ns_data": { 00:30:10.547 "id": 1, 00:30:10.547 "can_share": false 00:30:10.547 }, 00:30:10.547 "security": { 00:30:10.547 "opal": true 00:30:10.547 } 00:30:10.547 } 00:30:10.547 ], 00:30:10.547 "mp_policy": "active_passive" 00:30:10.547 } 00:30:10.547 } 00:30:10.547 ] 00:30:10.547 18:33:54 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:10.547 18:33:54 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:10.805 [2024-07-12 18:33:54.496554] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xeb00d0 PMD being used: compress_qat 00:30:13.337 27c87429-c106-4c12-9e71-5516e1fa6664 00:30:13.337 18:33:56 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l 
lvs0 lv0 100 00:30:13.337 0cf4702f-2e40-4fa3-89a3-6083203e8ea6 00:30:13.337 18:33:56 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:13.337 18:33:56 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:13.337 18:33:56 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:13.337 18:33:56 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:13.337 18:33:56 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:13.337 18:33:56 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:13.337 18:33:56 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:13.596 18:33:57 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:13.855 [ 00:30:13.855 { 00:30:13.855 "name": "0cf4702f-2e40-4fa3-89a3-6083203e8ea6", 00:30:13.855 "aliases": [ 00:30:13.855 "lvs0/lv0" 00:30:13.855 ], 00:30:13.855 "product_name": "Logical Volume", 00:30:13.855 "block_size": 512, 00:30:13.855 "num_blocks": 204800, 00:30:13.855 "uuid": "0cf4702f-2e40-4fa3-89a3-6083203e8ea6", 00:30:13.855 "assigned_rate_limits": { 00:30:13.855 "rw_ios_per_sec": 0, 00:30:13.855 "rw_mbytes_per_sec": 0, 00:30:13.855 "r_mbytes_per_sec": 0, 00:30:13.855 "w_mbytes_per_sec": 0 00:30:13.855 }, 00:30:13.855 "claimed": false, 00:30:13.855 "zoned": false, 00:30:13.855 "supported_io_types": { 00:30:13.855 "read": true, 00:30:13.855 "write": true, 00:30:13.855 "unmap": true, 00:30:13.855 "flush": false, 00:30:13.855 "reset": true, 00:30:13.855 "nvme_admin": false, 00:30:13.855 "nvme_io": false, 00:30:13.855 "nvme_io_md": false, 00:30:13.855 "write_zeroes": true, 00:30:13.855 "zcopy": false, 00:30:13.855 "get_zone_info": false, 00:30:13.855 "zone_management": false, 00:30:13.855 "zone_append": false, 
00:30:13.855 "compare": false, 00:30:13.855 "compare_and_write": false, 00:30:13.855 "abort": false, 00:30:13.855 "seek_hole": true, 00:30:13.855 "seek_data": true, 00:30:13.855 "copy": false, 00:30:13.855 "nvme_iov_md": false 00:30:13.855 }, 00:30:13.855 "driver_specific": { 00:30:13.855 "lvol": { 00:30:13.855 "lvol_store_uuid": "27c87429-c106-4c12-9e71-5516e1fa6664", 00:30:13.855 "base_bdev": "Nvme0n1", 00:30:13.855 "thin_provision": true, 00:30:13.855 "num_allocated_clusters": 0, 00:30:13.855 "snapshot": false, 00:30:13.855 "clone": false, 00:30:13.855 "esnap_clone": false 00:30:13.855 } 00:30:13.855 } 00:30:13.855 } 00:30:13.855 ] 00:30:13.855 18:33:57 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:13.855 18:33:57 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:13.855 18:33:57 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:14.114 [2024-07-12 18:33:57.703280] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:14.114 COMP_lvs0/lv0 00:30:14.114 18:33:57 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:14.114 18:33:57 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:14.114 18:33:57 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:14.114 18:33:57 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:14.114 18:33:57 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:14.114 18:33:57 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:14.114 18:33:57 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:14.373 18:33:57 compress_compdev -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:14.632 [ 00:30:14.632 { 00:30:14.632 "name": "COMP_lvs0/lv0", 00:30:14.632 "aliases": [ 00:30:14.632 "1af63df3-14af-5a65-bc1f-ca8518d03b9c" 00:30:14.632 ], 00:30:14.632 "product_name": "compress", 00:30:14.632 "block_size": 4096, 00:30:14.632 "num_blocks": 25088, 00:30:14.632 "uuid": "1af63df3-14af-5a65-bc1f-ca8518d03b9c", 00:30:14.632 "assigned_rate_limits": { 00:30:14.632 "rw_ios_per_sec": 0, 00:30:14.632 "rw_mbytes_per_sec": 0, 00:30:14.632 "r_mbytes_per_sec": 0, 00:30:14.632 "w_mbytes_per_sec": 0 00:30:14.632 }, 00:30:14.632 "claimed": false, 00:30:14.632 "zoned": false, 00:30:14.632 "supported_io_types": { 00:30:14.632 "read": true, 00:30:14.632 "write": true, 00:30:14.632 "unmap": false, 00:30:14.632 "flush": false, 00:30:14.632 "reset": false, 00:30:14.632 "nvme_admin": false, 00:30:14.632 "nvme_io": false, 00:30:14.632 "nvme_io_md": false, 00:30:14.632 "write_zeroes": true, 00:30:14.632 "zcopy": false, 00:30:14.632 "get_zone_info": false, 00:30:14.632 "zone_management": false, 00:30:14.632 "zone_append": false, 00:30:14.632 "compare": false, 00:30:14.632 "compare_and_write": false, 00:30:14.632 "abort": false, 00:30:14.632 "seek_hole": false, 00:30:14.632 "seek_data": false, 00:30:14.632 "copy": false, 00:30:14.632 "nvme_iov_md": false 00:30:14.632 }, 00:30:14.632 "driver_specific": { 00:30:14.632 "compress": { 00:30:14.632 "name": "COMP_lvs0/lv0", 00:30:14.632 "base_bdev_name": "0cf4702f-2e40-4fa3-89a3-6083203e8ea6" 00:30:14.632 } 00:30:14.632 } 00:30:14.632 } 00:30:14.632 ] 00:30:14.632 18:33:58 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:14.632 18:33:58 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:14.632 [2024-07-12 18:33:58.357786] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f3efc1b15c0 PMD being 
used: compress_qat 00:30:14.891 [2024-07-12 18:33:58.360056] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1048700 PMD being used: compress_qat 00:30:14.891 Running I/O for 3 seconds... 00:30:18.241 00:30:18.241 Latency(us) 00:30:18.241 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:18.241 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:18.241 Verification LBA range: start 0x0 length 0x3100 00:30:18.241 COMP_lvs0/lv0 : 3.00 5099.72 19.92 0.00 0.00 6222.94 523.58 5983.72 00:30:18.241 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:18.241 Verification LBA range: start 0x3100 length 0x3100 00:30:18.241 COMP_lvs0/lv0 : 3.00 5335.38 20.84 0.00 0.00 5960.45 356.17 5812.76 00:30:18.241 =================================================================================================================== 00:30:18.241 Total : 10435.10 40.76 0.00 0.00 6088.73 356.17 5983.72 00:30:18.241 0 00:30:18.241 18:34:01 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:18.241 18:34:01 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:18.241 18:34:01 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:18.241 18:34:01 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:18.241 18:34:01 compress_compdev -- compress/compress.sh@78 -- # killprocess 2630207 00:30:18.241 18:34:01 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2630207 ']' 00:30:18.241 18:34:01 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2630207 00:30:18.241 18:34:01 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:18.241 18:34:01 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:18.241 
18:34:01 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2630207 00:30:18.241 18:34:01 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:18.241 18:34:01 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:18.241 18:34:01 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2630207' 00:30:18.241 killing process with pid 2630207 00:30:18.241 18:34:01 compress_compdev -- common/autotest_common.sh@967 -- # kill 2630207 00:30:18.241 Received shutdown signal, test time was about 3.000000 seconds 00:30:18.241 00:30:18.241 Latency(us) 00:30:18.241 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:18.241 =================================================================================================================== 00:30:18.241 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:18.241 18:34:01 compress_compdev -- common/autotest_common.sh@972 -- # wait 2630207 00:30:21.524 18:34:04 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:30:21.524 18:34:04 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:21.524 18:34:04 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=2631811 00:30:21.524 18:34:04 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:21.524 18:34:04 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:30:21.524 18:34:04 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 2631811 00:30:21.524 18:34:04 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2631811 ']' 00:30:21.524 18:34:04 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:21.524 18:34:04 
compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:21.524 18:34:04 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:21.524 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:21.524 18:34:04 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:21.524 18:34:04 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:21.524 [2024-07-12 18:34:05.001695] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:30:21.524 [2024-07-12 18:34:05.001771] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2631811 ] 00:30:21.524 [2024-07-12 18:34:05.132415] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:21.524 [2024-07-12 18:34:05.238274] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:21.524 [2024-07-12 18:34:05.238360] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:21.524 [2024-07-12 18:34:05.238365] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:22.462 [2024-07-12 18:34:05.991145] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:22.462 18:34:06 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:22.462 18:34:06 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:22.462 18:34:06 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:30:22.462 18:34:06 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:22.462 18:34:06 compress_compdev -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:23.030 [2024-07-12 18:34:06.631578] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xbe5f20 PMD being used: compress_qat 00:30:23.030 18:34:06 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:23.030 18:34:06 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:23.030 18:34:06 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:23.030 18:34:06 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:23.031 18:34:06 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:23.031 18:34:06 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:23.031 18:34:06 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:23.289 18:34:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:23.548 [ 00:30:23.548 { 00:30:23.548 "name": "Nvme0n1", 00:30:23.548 "aliases": [ 00:30:23.548 "01000000-0000-0000-5cd2-e43197705251" 00:30:23.548 ], 00:30:23.548 "product_name": "NVMe disk", 00:30:23.548 "block_size": 512, 00:30:23.548 "num_blocks": 15002931888, 00:30:23.548 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:23.548 "assigned_rate_limits": { 00:30:23.548 "rw_ios_per_sec": 0, 00:30:23.548 "rw_mbytes_per_sec": 0, 00:30:23.548 "r_mbytes_per_sec": 0, 00:30:23.548 "w_mbytes_per_sec": 0 00:30:23.548 }, 00:30:23.548 "claimed": false, 00:30:23.548 "zoned": false, 00:30:23.548 "supported_io_types": { 00:30:23.548 "read": true, 00:30:23.548 "write": true, 00:30:23.548 "unmap": true, 00:30:23.548 "flush": true, 00:30:23.548 "reset": true, 00:30:23.548 "nvme_admin": true, 00:30:23.548 "nvme_io": true, 00:30:23.548 "nvme_io_md": false, 00:30:23.548 
"write_zeroes": true, 00:30:23.548 "zcopy": false, 00:30:23.548 "get_zone_info": false, 00:30:23.548 "zone_management": false, 00:30:23.548 "zone_append": false, 00:30:23.548 "compare": false, 00:30:23.548 "compare_and_write": false, 00:30:23.548 "abort": true, 00:30:23.548 "seek_hole": false, 00:30:23.548 "seek_data": false, 00:30:23.548 "copy": false, 00:30:23.548 "nvme_iov_md": false 00:30:23.548 }, 00:30:23.548 "driver_specific": { 00:30:23.548 "nvme": [ 00:30:23.548 { 00:30:23.548 "pci_address": "0000:5e:00.0", 00:30:23.548 "trid": { 00:30:23.548 "trtype": "PCIe", 00:30:23.548 "traddr": "0000:5e:00.0" 00:30:23.548 }, 00:30:23.548 "ctrlr_data": { 00:30:23.548 "cntlid": 0, 00:30:23.548 "vendor_id": "0x8086", 00:30:23.548 "model_number": "INTEL SSDPF2KX076TZO", 00:30:23.548 "serial_number": "PHAC0301002G7P6CGN", 00:30:23.548 "firmware_revision": "JCV10200", 00:30:23.548 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:23.548 "oacs": { 00:30:23.548 "security": 1, 00:30:23.548 "format": 1, 00:30:23.548 "firmware": 1, 00:30:23.548 "ns_manage": 1 00:30:23.548 }, 00:30:23.548 "multi_ctrlr": false, 00:30:23.548 "ana_reporting": false 00:30:23.548 }, 00:30:23.548 "vs": { 00:30:23.548 "nvme_version": "1.3" 00:30:23.548 }, 00:30:23.548 "ns_data": { 00:30:23.548 "id": 1, 00:30:23.548 "can_share": false 00:30:23.548 }, 00:30:23.548 "security": { 00:30:23.548 "opal": true 00:30:23.548 } 00:30:23.548 } 00:30:23.548 ], 00:30:23.548 "mp_policy": "active_passive" 00:30:23.548 } 00:30:23.548 } 00:30:23.548 ] 00:30:23.548 18:34:07 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:23.548 18:34:07 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:23.807 [2024-07-12 18:34:07.377105] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa34440 PMD being used: compress_qat 00:30:26.342 8b3f33da-0f87-4d03-ae5a-553e06e387b4 
00:30:26.342 18:34:09 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:26.342 7a1783cc-8dea-41c1-87ac-754624c6ffad 00:30:26.342 18:34:09 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:26.342 18:34:09 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:26.342 18:34:09 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:26.342 18:34:09 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:26.342 18:34:09 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:26.342 18:34:09 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:26.342 18:34:09 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:26.600 18:34:10 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:26.600 [ 00:30:26.600 { 00:30:26.600 "name": "7a1783cc-8dea-41c1-87ac-754624c6ffad", 00:30:26.600 "aliases": [ 00:30:26.600 "lvs0/lv0" 00:30:26.600 ], 00:30:26.600 "product_name": "Logical Volume", 00:30:26.600 "block_size": 512, 00:30:26.600 "num_blocks": 204800, 00:30:26.600 "uuid": "7a1783cc-8dea-41c1-87ac-754624c6ffad", 00:30:26.600 "assigned_rate_limits": { 00:30:26.600 "rw_ios_per_sec": 0, 00:30:26.600 "rw_mbytes_per_sec": 0, 00:30:26.600 "r_mbytes_per_sec": 0, 00:30:26.600 "w_mbytes_per_sec": 0 00:30:26.600 }, 00:30:26.600 "claimed": false, 00:30:26.600 "zoned": false, 00:30:26.600 "supported_io_types": { 00:30:26.600 "read": true, 00:30:26.600 "write": true, 00:30:26.600 "unmap": true, 00:30:26.600 "flush": false, 00:30:26.600 "reset": true, 00:30:26.600 "nvme_admin": false, 00:30:26.600 "nvme_io": false, 00:30:26.600 "nvme_io_md": false, 00:30:26.600 
"write_zeroes": true, 00:30:26.600 "zcopy": false, 00:30:26.600 "get_zone_info": false, 00:30:26.600 "zone_management": false, 00:30:26.600 "zone_append": false, 00:30:26.600 "compare": false, 00:30:26.600 "compare_and_write": false, 00:30:26.600 "abort": false, 00:30:26.600 "seek_hole": true, 00:30:26.600 "seek_data": true, 00:30:26.600 "copy": false, 00:30:26.600 "nvme_iov_md": false 00:30:26.600 }, 00:30:26.600 "driver_specific": { 00:30:26.600 "lvol": { 00:30:26.600 "lvol_store_uuid": "8b3f33da-0f87-4d03-ae5a-553e06e387b4", 00:30:26.600 "base_bdev": "Nvme0n1", 00:30:26.600 "thin_provision": true, 00:30:26.600 "num_allocated_clusters": 0, 00:30:26.600 "snapshot": false, 00:30:26.600 "clone": false, 00:30:26.600 "esnap_clone": false 00:30:26.600 } 00:30:26.600 } 00:30:26.600 } 00:30:26.600 ] 00:30:26.858 18:34:10 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:26.858 18:34:10 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:26.858 18:34:10 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:26.858 [2024-07-12 18:34:10.561305] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:26.858 COMP_lvs0/lv0 00:30:26.858 18:34:10 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:26.858 18:34:10 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:26.858 18:34:10 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:26.858 18:34:10 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:26.858 18:34:10 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:26.858 18:34:10 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:26.858 18:34:10 compress_compdev -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:27.116 18:34:10 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:27.374 [ 00:30:27.374 { 00:30:27.374 "name": "COMP_lvs0/lv0", 00:30:27.374 "aliases": [ 00:30:27.374 "fa800cf9-5519-591c-ba1a-d8809847a357" 00:30:27.374 ], 00:30:27.374 "product_name": "compress", 00:30:27.374 "block_size": 512, 00:30:27.374 "num_blocks": 200704, 00:30:27.374 "uuid": "fa800cf9-5519-591c-ba1a-d8809847a357", 00:30:27.374 "assigned_rate_limits": { 00:30:27.374 "rw_ios_per_sec": 0, 00:30:27.374 "rw_mbytes_per_sec": 0, 00:30:27.374 "r_mbytes_per_sec": 0, 00:30:27.374 "w_mbytes_per_sec": 0 00:30:27.374 }, 00:30:27.374 "claimed": false, 00:30:27.374 "zoned": false, 00:30:27.374 "supported_io_types": { 00:30:27.374 "read": true, 00:30:27.374 "write": true, 00:30:27.374 "unmap": false, 00:30:27.374 "flush": false, 00:30:27.374 "reset": false, 00:30:27.374 "nvme_admin": false, 00:30:27.374 "nvme_io": false, 00:30:27.374 "nvme_io_md": false, 00:30:27.374 "write_zeroes": true, 00:30:27.374 "zcopy": false, 00:30:27.374 "get_zone_info": false, 00:30:27.374 "zone_management": false, 00:30:27.374 "zone_append": false, 00:30:27.374 "compare": false, 00:30:27.374 "compare_and_write": false, 00:30:27.374 "abort": false, 00:30:27.374 "seek_hole": false, 00:30:27.374 "seek_data": false, 00:30:27.374 "copy": false, 00:30:27.374 "nvme_iov_md": false 00:30:27.374 }, 00:30:27.374 "driver_specific": { 00:30:27.374 "compress": { 00:30:27.374 "name": "COMP_lvs0/lv0", 00:30:27.374 "base_bdev_name": "7a1783cc-8dea-41c1-87ac-754624c6ffad" 00:30:27.374 } 00:30:27.374 } 00:30:27.374 } 00:30:27.374 ] 00:30:27.374 18:34:11 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:27.374 18:34:11 compress_compdev -- compress/compress.sh@59 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:27.633 [2024-07-12 18:34:11.182308] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f895c1b1350 PMD being used: compress_qat 00:30:27.633 I/O targets: 00:30:27.633 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:30:27.633 00:30:27.633 00:30:27.633 CUnit - A unit testing framework for C - Version 2.1-3 00:30:27.633 http://cunit.sourceforge.net/ 00:30:27.633 00:30:27.633 00:30:27.633 Suite: bdevio tests on: COMP_lvs0/lv0 00:30:27.633 Test: blockdev write read block ...passed 00:30:27.633 Test: blockdev write zeroes read block ...passed 00:30:27.633 Test: blockdev write zeroes read no split ...passed 00:30:27.633 Test: blockdev write zeroes read split ...passed 00:30:27.633 Test: blockdev write zeroes read split partial ...passed 00:30:27.633 Test: blockdev reset ...[2024-07-12 18:34:11.219386] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:30:27.633 passed 00:30:27.633 Test: blockdev write read 8 blocks ...passed 00:30:27.633 Test: blockdev write read size > 128k ...passed 00:30:27.633 Test: blockdev write read invalid size ...passed 00:30:27.633 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:27.633 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:27.633 Test: blockdev write read max offset ...passed 00:30:27.633 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:27.633 Test: blockdev writev readv 8 blocks ...passed 00:30:27.633 Test: blockdev writev readv 30 x 1block ...passed 00:30:27.633 Test: blockdev writev readv block ...passed 00:30:27.633 Test: blockdev writev readv size > 128k ...passed 00:30:27.633 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:27.633 Test: blockdev comparev and writev ...passed 00:30:27.633 Test: blockdev nvme passthru rw ...passed 00:30:27.633 Test: blockdev nvme passthru vendor 
specific ...passed 00:30:27.633 Test: blockdev nvme admin passthru ...passed 00:30:27.633 Test: blockdev copy ...passed 00:30:27.633 00:30:27.633 Run Summary: Type Total Ran Passed Failed Inactive 00:30:27.633 suites 1 1 n/a 0 0 00:30:27.633 tests 23 23 23 0 0 00:30:27.633 asserts 130 130 130 0 n/a 00:30:27.633 00:30:27.633 Elapsed time = 0.090 seconds 00:30:27.633 0 00:30:27.633 18:34:11 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:30:27.633 18:34:11 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:27.892 18:34:11 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:28.151 18:34:11 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:30:28.151 18:34:11 compress_compdev -- compress/compress.sh@62 -- # killprocess 2631811 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2631811 ']' 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2631811 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2631811 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2631811' 00:30:28.151 killing process with pid 2631811 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@967 -- # kill 2631811 00:30:28.151 18:34:11 compress_compdev -- common/autotest_common.sh@972 -- # wait 
2631811 00:30:31.436 18:34:14 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:30:31.436 18:34:14 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:30:31.436 00:30:31.436 real 0m48.591s 00:30:31.436 user 1m52.280s 00:30:31.436 sys 0m5.837s 00:30:31.436 18:34:14 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:31.436 18:34:14 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:31.436 ************************************ 00:30:31.436 END TEST compress_compdev 00:30:31.436 ************************************ 00:30:31.436 18:34:14 -- common/autotest_common.sh@1142 -- # return 0 00:30:31.436 18:34:14 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:30:31.436 18:34:14 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:31.436 18:34:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:31.436 18:34:14 -- common/autotest_common.sh@10 -- # set +x 00:30:31.436 ************************************ 00:30:31.436 START TEST compress_isal 00:30:31.436 ************************************ 00:30:31.436 18:34:14 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:30:31.436 * Looking for test storage... 
00:30:31.436 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:31.436 18:34:14 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:31.436 18:34:14 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:31.436 18:34:15 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:31.436 18:34:15 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:31.436 18:34:15 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:31.436 18:34:15 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:31.436 18:34:15 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:31.436 18:34:15 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:31.436 18:34:15 compress_isal -- paths/export.sh@5 -- # export PATH 00:30:31.436 18:34:15 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:31.436 18:34:15 compress_isal -- nvmf/common.sh@47 -- # : 0 00:30:31.436 18:34:15 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:31.436 18:34:15 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:31.436 18:34:15 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:31.436 18:34:15 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:31.436 18:34:15 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:31.436 18:34:15 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:31.436 18:34:15 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:31.436 18:34:15 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:31.436 18:34:15 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:31.436 18:34:15 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:31.436 18:34:15 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:30:31.436 18:34:15 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:31.436 18:34:15 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:31.436 18:34:15 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2633190 00:30:31.436 18:34:15 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:31.436 18:34:15 
compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:31.436 18:34:15 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2633190 00:30:31.436 18:34:15 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2633190 ']' 00:30:31.436 18:34:15 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:31.436 18:34:15 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:31.436 18:34:15 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:31.436 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:31.436 18:34:15 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:31.436 18:34:15 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:31.436 [2024-07-12 18:34:15.066420] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:30:31.436 [2024-07-12 18:34:15.066506] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2633190 ] 00:30:31.695 [2024-07-12 18:34:15.185052] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:31.695 [2024-07-12 18:34:15.289312] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:31.695 [2024-07-12 18:34:15.289318] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:31.954 18:34:15 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:31.954 18:34:15 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:31.954 18:34:15 compress_isal -- compress/compress.sh@74 -- # create_vols 00:30:31.954 18:34:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:31.954 18:34:15 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:32.521 18:34:16 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:32.521 18:34:16 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:32.521 18:34:16 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:32.521 18:34:16 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:32.521 18:34:16 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:32.521 18:34:16 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:32.521 18:34:16 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:32.779 18:34:16 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:33.038 [ 00:30:33.038 { 00:30:33.038 "name": "Nvme0n1", 00:30:33.038 "aliases": [ 00:30:33.038 "01000000-0000-0000-5cd2-e43197705251" 00:30:33.038 ], 00:30:33.038 "product_name": "NVMe disk", 00:30:33.038 "block_size": 512, 00:30:33.038 "num_blocks": 15002931888, 00:30:33.038 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:33.038 "assigned_rate_limits": { 00:30:33.038 "rw_ios_per_sec": 0, 00:30:33.038 "rw_mbytes_per_sec": 0, 00:30:33.038 "r_mbytes_per_sec": 0, 00:30:33.038 "w_mbytes_per_sec": 0 00:30:33.038 }, 00:30:33.038 "claimed": false, 00:30:33.038 "zoned": false, 00:30:33.038 "supported_io_types": { 00:30:33.038 "read": true, 00:30:33.038 "write": true, 00:30:33.038 "unmap": true, 00:30:33.038 "flush": true, 00:30:33.038 "reset": true, 00:30:33.038 "nvme_admin": true, 00:30:33.038 "nvme_io": true, 00:30:33.038 "nvme_io_md": false, 00:30:33.038 "write_zeroes": true, 00:30:33.038 "zcopy": false, 00:30:33.038 "get_zone_info": false, 00:30:33.038 "zone_management": false, 00:30:33.038 "zone_append": false, 00:30:33.038 "compare": false, 00:30:33.038 "compare_and_write": false, 00:30:33.038 "abort": true, 00:30:33.038 "seek_hole": false, 00:30:33.038 "seek_data": false, 00:30:33.038 "copy": false, 00:30:33.038 "nvme_iov_md": false 00:30:33.038 }, 00:30:33.038 "driver_specific": { 00:30:33.038 "nvme": [ 00:30:33.038 { 00:30:33.038 "pci_address": "0000:5e:00.0", 00:30:33.038 "trid": { 00:30:33.038 "trtype": "PCIe", 00:30:33.038 "traddr": "0000:5e:00.0" 00:30:33.038 }, 00:30:33.038 "ctrlr_data": { 00:30:33.038 "cntlid": 0, 00:30:33.038 "vendor_id": "0x8086", 00:30:33.038 "model_number": "INTEL SSDPF2KX076TZO", 00:30:33.038 "serial_number": "PHAC0301002G7P6CGN", 00:30:33.038 "firmware_revision": "JCV10200", 00:30:33.038 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:33.038 "oacs": { 00:30:33.038 "security": 1, 00:30:33.038 "format": 1, 00:30:33.038 "firmware": 1, 00:30:33.038 "ns_manage": 1 00:30:33.038 }, 
00:30:33.038 "multi_ctrlr": false, 00:30:33.038 "ana_reporting": false 00:30:33.038 }, 00:30:33.038 "vs": { 00:30:33.038 "nvme_version": "1.3" 00:30:33.038 }, 00:30:33.038 "ns_data": { 00:30:33.038 "id": 1, 00:30:33.038 "can_share": false 00:30:33.038 }, 00:30:33.038 "security": { 00:30:33.038 "opal": true 00:30:33.038 } 00:30:33.038 } 00:30:33.038 ], 00:30:33.038 "mp_policy": "active_passive" 00:30:33.038 } 00:30:33.038 } 00:30:33.038 ] 00:30:33.038 18:34:16 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:33.038 18:34:16 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:35.571 2f147762-ef9a-4bfa-b87b-d493431f0902 00:30:35.571 18:34:19 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:35.829 18dbe4b1-2737-4982-bafb-b9b0ecd6cd60 00:30:35.829 18:34:19 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:35.829 18:34:19 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:35.829 18:34:19 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:35.829 18:34:19 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:35.829 18:34:19 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:35.829 18:34:19 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:35.829 18:34:19 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:36.088 18:34:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:36.348 [ 00:30:36.348 { 00:30:36.348 "name": "18dbe4b1-2737-4982-bafb-b9b0ecd6cd60", 00:30:36.348 "aliases": [ 00:30:36.348 "lvs0/lv0" 
00:30:36.348 ], 00:30:36.348 "product_name": "Logical Volume", 00:30:36.348 "block_size": 512, 00:30:36.348 "num_blocks": 204800, 00:30:36.348 "uuid": "18dbe4b1-2737-4982-bafb-b9b0ecd6cd60", 00:30:36.348 "assigned_rate_limits": { 00:30:36.348 "rw_ios_per_sec": 0, 00:30:36.348 "rw_mbytes_per_sec": 0, 00:30:36.348 "r_mbytes_per_sec": 0, 00:30:36.348 "w_mbytes_per_sec": 0 00:30:36.348 }, 00:30:36.348 "claimed": false, 00:30:36.348 "zoned": false, 00:30:36.348 "supported_io_types": { 00:30:36.348 "read": true, 00:30:36.348 "write": true, 00:30:36.348 "unmap": true, 00:30:36.348 "flush": false, 00:30:36.348 "reset": true, 00:30:36.348 "nvme_admin": false, 00:30:36.348 "nvme_io": false, 00:30:36.348 "nvme_io_md": false, 00:30:36.348 "write_zeroes": true, 00:30:36.348 "zcopy": false, 00:30:36.348 "get_zone_info": false, 00:30:36.348 "zone_management": false, 00:30:36.348 "zone_append": false, 00:30:36.348 "compare": false, 00:30:36.348 "compare_and_write": false, 00:30:36.348 "abort": false, 00:30:36.348 "seek_hole": true, 00:30:36.348 "seek_data": true, 00:30:36.348 "copy": false, 00:30:36.348 "nvme_iov_md": false 00:30:36.348 }, 00:30:36.348 "driver_specific": { 00:30:36.348 "lvol": { 00:30:36.348 "lvol_store_uuid": "2f147762-ef9a-4bfa-b87b-d493431f0902", 00:30:36.348 "base_bdev": "Nvme0n1", 00:30:36.348 "thin_provision": true, 00:30:36.348 "num_allocated_clusters": 0, 00:30:36.348 "snapshot": false, 00:30:36.348 "clone": false, 00:30:36.348 "esnap_clone": false 00:30:36.348 } 00:30:36.348 } 00:30:36.348 } 00:30:36.348 ] 00:30:36.348 18:34:19 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:36.348 18:34:19 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:36.348 18:34:19 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:36.607 [2024-07-12 18:34:20.079399] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:30:36.607 COMP_lvs0/lv0 00:30:36.607 18:34:20 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:36.607 18:34:20 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:36.607 18:34:20 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:36.607 18:34:20 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:36.607 18:34:20 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:36.607 18:34:20 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:36.607 18:34:20 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:36.865 18:34:20 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:36.865 [ 00:30:36.865 { 00:30:36.865 "name": "COMP_lvs0/lv0", 00:30:36.865 "aliases": [ 00:30:36.865 "851c50d7-b543-5ae3-abea-4f97102df6f6" 00:30:36.865 ], 00:30:36.865 "product_name": "compress", 00:30:36.865 "block_size": 512, 00:30:36.865 "num_blocks": 200704, 00:30:36.865 "uuid": "851c50d7-b543-5ae3-abea-4f97102df6f6", 00:30:36.865 "assigned_rate_limits": { 00:30:36.865 "rw_ios_per_sec": 0, 00:30:36.865 "rw_mbytes_per_sec": 0, 00:30:36.865 "r_mbytes_per_sec": 0, 00:30:36.865 "w_mbytes_per_sec": 0 00:30:36.865 }, 00:30:36.865 "claimed": false, 00:30:36.865 "zoned": false, 00:30:36.865 "supported_io_types": { 00:30:36.865 "read": true, 00:30:36.865 "write": true, 00:30:36.865 "unmap": false, 00:30:36.865 "flush": false, 00:30:36.865 "reset": false, 00:30:36.865 "nvme_admin": false, 00:30:36.865 "nvme_io": false, 00:30:36.865 "nvme_io_md": false, 00:30:36.865 "write_zeroes": true, 00:30:36.865 "zcopy": false, 00:30:36.865 "get_zone_info": false, 00:30:36.865 "zone_management": false, 00:30:36.865 "zone_append": 
false, 00:30:36.865 "compare": false, 00:30:36.865 "compare_and_write": false, 00:30:36.865 "abort": false, 00:30:36.865 "seek_hole": false, 00:30:36.865 "seek_data": false, 00:30:36.865 "copy": false, 00:30:36.865 "nvme_iov_md": false 00:30:36.865 }, 00:30:36.865 "driver_specific": { 00:30:36.865 "compress": { 00:30:36.865 "name": "COMP_lvs0/lv0", 00:30:36.865 "base_bdev_name": "18dbe4b1-2737-4982-bafb-b9b0ecd6cd60" 00:30:36.865 } 00:30:36.865 } 00:30:36.865 } 00:30:36.865 ] 00:30:37.124 18:34:20 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:37.124 18:34:20 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:37.124 Running I/O for 3 seconds... 00:30:40.482 00:30:40.482 Latency(us) 00:30:40.482 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:40.482 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:40.482 Verification LBA range: start 0x0 length 0x3100 00:30:40.482 COMP_lvs0/lv0 : 3.00 3827.77 14.95 0.00 0.00 8302.73 694.54 7864.32 00:30:40.482 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:40.482 Verification LBA range: start 0x3100 length 0x3100 00:30:40.482 COMP_lvs0/lv0 : 3.00 3831.05 14.97 0.00 0.00 8307.86 527.14 7807.33 00:30:40.482 =================================================================================================================== 00:30:40.482 Total : 7658.82 29.92 0.00 0.00 8305.30 527.14 7864.32 00:30:40.482 0 00:30:40.482 18:34:23 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:40.482 18:34:23 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:40.482 18:34:24 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:40.741 
18:34:24 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:40.741 18:34:24 compress_isal -- compress/compress.sh@78 -- # killprocess 2633190 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2633190 ']' 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2633190 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2633190 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2633190' 00:30:40.741 killing process with pid 2633190 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@967 -- # kill 2633190 00:30:40.741 Received shutdown signal, test time was about 3.000000 seconds 00:30:40.741 00:30:40.741 Latency(us) 00:30:40.741 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:40.741 =================================================================================================================== 00:30:40.741 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:40.741 18:34:24 compress_isal -- common/autotest_common.sh@972 -- # wait 2633190 00:30:44.027 18:34:27 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:44.027 18:34:27 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:44.027 18:34:27 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2634722 00:30:44.027 18:34:27 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:44.027 18:34:27 
compress_isal -- compress/compress.sh@73 -- # waitforlisten 2634722 00:30:44.027 18:34:27 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:44.027 18:34:27 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2634722 ']' 00:30:44.027 18:34:27 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:44.027 18:34:27 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:44.027 18:34:27 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:44.027 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:44.027 18:34:27 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:44.027 18:34:27 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:44.027 [2024-07-12 18:34:27.355341] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:30:44.027 [2024-07-12 18:34:27.355410] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2634722 ] 00:30:44.027 [2024-07-12 18:34:27.474726] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:44.027 [2024-07-12 18:34:27.581690] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:44.027 [2024-07-12 18:34:27.581704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:44.596 18:34:28 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:44.596 18:34:28 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:44.596 18:34:28 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:30:44.596 18:34:28 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:44.596 18:34:28 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:45.533 18:34:28 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:45.533 18:34:28 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:45.533 18:34:28 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:45.533 18:34:28 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:45.533 18:34:28 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:45.533 18:34:28 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:45.533 18:34:28 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:45.533 18:34:29 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:45.792 [ 00:30:45.792 { 00:30:45.792 "name": "Nvme0n1", 00:30:45.792 "aliases": [ 00:30:45.792 "01000000-0000-0000-5cd2-e43197705251" 00:30:45.792 ], 00:30:45.792 "product_name": "NVMe disk", 00:30:45.792 "block_size": 512, 00:30:45.792 "num_blocks": 15002931888, 00:30:45.792 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:45.792 "assigned_rate_limits": { 00:30:45.792 "rw_ios_per_sec": 0, 00:30:45.792 "rw_mbytes_per_sec": 0, 00:30:45.792 "r_mbytes_per_sec": 0, 00:30:45.792 "w_mbytes_per_sec": 0 00:30:45.792 }, 00:30:45.792 "claimed": false, 00:30:45.792 "zoned": false, 00:30:45.792 "supported_io_types": { 00:30:45.792 "read": true, 00:30:45.792 "write": true, 00:30:45.792 "unmap": true, 00:30:45.792 "flush": true, 00:30:45.792 "reset": true, 00:30:45.792 "nvme_admin": true, 00:30:45.792 "nvme_io": true, 00:30:45.792 "nvme_io_md": false, 00:30:45.792 "write_zeroes": true, 00:30:45.792 "zcopy": false, 00:30:45.792 "get_zone_info": false, 00:30:45.792 "zone_management": false, 00:30:45.792 "zone_append": false, 00:30:45.792 "compare": false, 00:30:45.792 "compare_and_write": false, 00:30:45.792 "abort": true, 00:30:45.792 "seek_hole": false, 00:30:45.793 "seek_data": false, 00:30:45.793 "copy": false, 00:30:45.793 "nvme_iov_md": false 00:30:45.793 }, 00:30:45.793 "driver_specific": { 00:30:45.793 "nvme": [ 00:30:45.793 { 00:30:45.793 "pci_address": "0000:5e:00.0", 00:30:45.793 "trid": { 00:30:45.793 "trtype": "PCIe", 00:30:45.793 "traddr": "0000:5e:00.0" 00:30:45.793 }, 00:30:45.793 "ctrlr_data": { 00:30:45.793 "cntlid": 0, 00:30:45.793 "vendor_id": "0x8086", 00:30:45.793 "model_number": "INTEL SSDPF2KX076TZO", 00:30:45.793 "serial_number": "PHAC0301002G7P6CGN", 00:30:45.793 "firmware_revision": "JCV10200", 00:30:45.793 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:45.793 "oacs": { 00:30:45.793 "security": 1, 00:30:45.793 "format": 1, 00:30:45.793 "firmware": 1, 00:30:45.793 "ns_manage": 1 00:30:45.793 }, 
00:30:45.793 "multi_ctrlr": false, 00:30:45.793 "ana_reporting": false 00:30:45.793 }, 00:30:45.793 "vs": { 00:30:45.793 "nvme_version": "1.3" 00:30:45.793 }, 00:30:45.793 "ns_data": { 00:30:45.793 "id": 1, 00:30:45.793 "can_share": false 00:30:45.793 }, 00:30:45.793 "security": { 00:30:45.793 "opal": true 00:30:45.793 } 00:30:45.793 } 00:30:45.793 ], 00:30:45.793 "mp_policy": "active_passive" 00:30:45.793 } 00:30:45.793 } 00:30:45.793 ] 00:30:45.793 18:34:29 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:45.793 18:34:29 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:48.327 03f7fd93-e8b6-48ed-983c-3c56ddffbf25 00:30:48.327 18:34:31 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:48.586 cbc67450-8d20-458d-8787-d7cfe1f5a3b2 00:30:48.586 18:34:32 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:48.586 18:34:32 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:48.586 18:34:32 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:48.586 18:34:32 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:48.586 18:34:32 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:48.586 18:34:32 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:48.586 18:34:32 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:48.586 18:34:32 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:48.845 [ 00:30:48.845 { 00:30:48.845 "name": "cbc67450-8d20-458d-8787-d7cfe1f5a3b2", 00:30:48.845 "aliases": [ 00:30:48.845 "lvs0/lv0" 
00:30:48.845 ], 00:30:48.845 "product_name": "Logical Volume", 00:30:48.845 "block_size": 512, 00:30:48.845 "num_blocks": 204800, 00:30:48.845 "uuid": "cbc67450-8d20-458d-8787-d7cfe1f5a3b2", 00:30:48.845 "assigned_rate_limits": { 00:30:48.845 "rw_ios_per_sec": 0, 00:30:48.845 "rw_mbytes_per_sec": 0, 00:30:48.845 "r_mbytes_per_sec": 0, 00:30:48.845 "w_mbytes_per_sec": 0 00:30:48.845 }, 00:30:48.845 "claimed": false, 00:30:48.845 "zoned": false, 00:30:48.845 "supported_io_types": { 00:30:48.845 "read": true, 00:30:48.845 "write": true, 00:30:48.845 "unmap": true, 00:30:48.845 "flush": false, 00:30:48.845 "reset": true, 00:30:48.845 "nvme_admin": false, 00:30:48.845 "nvme_io": false, 00:30:48.845 "nvme_io_md": false, 00:30:48.845 "write_zeroes": true, 00:30:48.845 "zcopy": false, 00:30:48.845 "get_zone_info": false, 00:30:48.845 "zone_management": false, 00:30:48.845 "zone_append": false, 00:30:48.845 "compare": false, 00:30:48.845 "compare_and_write": false, 00:30:48.845 "abort": false, 00:30:48.845 "seek_hole": true, 00:30:48.845 "seek_data": true, 00:30:48.845 "copy": false, 00:30:48.845 "nvme_iov_md": false 00:30:48.845 }, 00:30:48.845 "driver_specific": { 00:30:48.845 "lvol": { 00:30:48.845 "lvol_store_uuid": "03f7fd93-e8b6-48ed-983c-3c56ddffbf25", 00:30:48.845 "base_bdev": "Nvme0n1", 00:30:48.845 "thin_provision": true, 00:30:48.845 "num_allocated_clusters": 0, 00:30:48.845 "snapshot": false, 00:30:48.845 "clone": false, 00:30:48.845 "esnap_clone": false 00:30:48.845 } 00:30:48.845 } 00:30:48.845 } 00:30:48.845 ] 00:30:48.845 18:34:32 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:48.845 18:34:32 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:48.845 18:34:32 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:49.104 [2024-07-12 18:34:32.750645] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:30:49.104 COMP_lvs0/lv0 00:30:49.104 18:34:32 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:49.104 18:34:32 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:49.104 18:34:32 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:49.104 18:34:32 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:49.104 18:34:32 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:49.104 18:34:32 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:49.104 18:34:32 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:49.362 18:34:33 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:49.620 [ 00:30:49.620 { 00:30:49.620 "name": "COMP_lvs0/lv0", 00:30:49.620 "aliases": [ 00:30:49.620 "0b7356e6-1a48-5464-888c-69473bf02042" 00:30:49.620 ], 00:30:49.620 "product_name": "compress", 00:30:49.620 "block_size": 512, 00:30:49.620 "num_blocks": 200704, 00:30:49.620 "uuid": "0b7356e6-1a48-5464-888c-69473bf02042", 00:30:49.620 "assigned_rate_limits": { 00:30:49.620 "rw_ios_per_sec": 0, 00:30:49.620 "rw_mbytes_per_sec": 0, 00:30:49.620 "r_mbytes_per_sec": 0, 00:30:49.620 "w_mbytes_per_sec": 0 00:30:49.620 }, 00:30:49.620 "claimed": false, 00:30:49.620 "zoned": false, 00:30:49.620 "supported_io_types": { 00:30:49.620 "read": true, 00:30:49.620 "write": true, 00:30:49.620 "unmap": false, 00:30:49.620 "flush": false, 00:30:49.621 "reset": false, 00:30:49.621 "nvme_admin": false, 00:30:49.621 "nvme_io": false, 00:30:49.621 "nvme_io_md": false, 00:30:49.621 "write_zeroes": true, 00:30:49.621 "zcopy": false, 00:30:49.621 "get_zone_info": false, 00:30:49.621 "zone_management": false, 00:30:49.621 "zone_append": 
false, 00:30:49.621 "compare": false, 00:30:49.621 "compare_and_write": false, 00:30:49.621 "abort": false, 00:30:49.621 "seek_hole": false, 00:30:49.621 "seek_data": false, 00:30:49.621 "copy": false, 00:30:49.621 "nvme_iov_md": false 00:30:49.621 }, 00:30:49.621 "driver_specific": { 00:30:49.621 "compress": { 00:30:49.621 "name": "COMP_lvs0/lv0", 00:30:49.621 "base_bdev_name": "cbc67450-8d20-458d-8787-d7cfe1f5a3b2" 00:30:49.621 } 00:30:49.621 } 00:30:49.621 } 00:30:49.621 ] 00:30:49.621 18:34:33 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:49.621 18:34:33 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:49.879 Running I/O for 3 seconds... 00:30:53.165 00:30:53.165 Latency(us) 00:30:53.165 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:53.165 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:53.165 Verification LBA range: start 0x0 length 0x3100 00:30:53.165 COMP_lvs0/lv0 : 3.00 3944.88 15.41 0.00 0.00 8058.21 680.29 7522.39 00:30:53.165 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:53.165 Verification LBA range: start 0x3100 length 0x3100 00:30:53.165 COMP_lvs0/lv0 : 3.00 3948.37 15.42 0.00 0.00 8063.28 584.13 7579.38 00:30:53.165 =================================================================================================================== 00:30:53.165 Total : 7893.25 30.83 0.00 0.00 8060.75 584.13 7579.38 00:30:53.165 0 00:30:53.165 18:34:36 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:53.165 18:34:36 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:53.165 18:34:36 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:53.165 
18:34:36 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:53.165 18:34:36 compress_isal -- compress/compress.sh@78 -- # killprocess 2634722 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2634722 ']' 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2634722 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2634722 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2634722' 00:30:53.165 killing process with pid 2634722 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@967 -- # kill 2634722 00:30:53.165 Received shutdown signal, test time was about 3.000000 seconds 00:30:53.165 00:30:53.165 Latency(us) 00:30:53.165 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:53.165 =================================================================================================================== 00:30:53.165 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:53.165 18:34:36 compress_isal -- common/autotest_common.sh@972 -- # wait 2634722 00:30:56.448 18:34:39 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:56.448 18:34:39 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:56.448 18:34:39 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2636365 00:30:56.448 18:34:39 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify 
-t 3 -C -m 0x6 00:30:56.448 18:34:39 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:56.448 18:34:39 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2636365 00:30:56.448 18:34:39 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2636365 ']' 00:30:56.448 18:34:39 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:56.448 18:34:39 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:56.448 18:34:39 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:56.448 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:56.448 18:34:39 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:56.448 18:34:39 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:56.448 [2024-07-12 18:34:39.914232] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:30:56.448 [2024-07-12 18:34:39.914305] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2636365 ] 00:30:56.448 [2024-07-12 18:34:40.038387] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:56.448 [2024-07-12 18:34:40.139529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:56.448 [2024-07-12 18:34:40.139534] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:57.379 18:34:40 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:57.379 18:34:40 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:57.379 18:34:40 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:30:57.379 18:34:40 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:57.379 18:34:40 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:57.946 18:34:41 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:57.946 18:34:41 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:57.946 18:34:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:57.946 18:34:41 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:57.946 18:34:41 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:57.946 18:34:41 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:57.946 18:34:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:58.205 18:34:41 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:58.476 [ 00:30:58.476 { 00:30:58.476 "name": "Nvme0n1", 00:30:58.476 "aliases": [ 00:30:58.476 "01000000-0000-0000-5cd2-e43197705251" 00:30:58.476 ], 00:30:58.476 "product_name": "NVMe disk", 00:30:58.476 "block_size": 512, 00:30:58.476 "num_blocks": 15002931888, 00:30:58.476 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:58.476 "assigned_rate_limits": { 00:30:58.476 "rw_ios_per_sec": 0, 00:30:58.476 "rw_mbytes_per_sec": 0, 00:30:58.476 "r_mbytes_per_sec": 0, 00:30:58.476 "w_mbytes_per_sec": 0 00:30:58.476 }, 00:30:58.476 "claimed": false, 00:30:58.476 "zoned": false, 00:30:58.476 "supported_io_types": { 00:30:58.476 "read": true, 00:30:58.476 "write": true, 00:30:58.476 "unmap": true, 00:30:58.476 "flush": true, 00:30:58.476 "reset": true, 00:30:58.476 "nvme_admin": true, 00:30:58.476 "nvme_io": true, 00:30:58.476 "nvme_io_md": false, 00:30:58.476 "write_zeroes": true, 00:30:58.476 "zcopy": false, 00:30:58.476 "get_zone_info": false, 00:30:58.476 "zone_management": false, 00:30:58.476 "zone_append": false, 00:30:58.476 "compare": false, 00:30:58.476 "compare_and_write": false, 00:30:58.476 "abort": true, 00:30:58.476 "seek_hole": false, 00:30:58.476 "seek_data": false, 00:30:58.476 "copy": false, 00:30:58.476 "nvme_iov_md": false 00:30:58.476 }, 00:30:58.476 "driver_specific": { 00:30:58.476 "nvme": [ 00:30:58.476 { 00:30:58.476 "pci_address": "0000:5e:00.0", 00:30:58.476 "trid": { 00:30:58.476 "trtype": "PCIe", 00:30:58.476 "traddr": "0000:5e:00.0" 00:30:58.476 }, 00:30:58.476 "ctrlr_data": { 00:30:58.476 "cntlid": 0, 00:30:58.476 "vendor_id": "0x8086", 00:30:58.476 "model_number": "INTEL SSDPF2KX076TZO", 00:30:58.476 "serial_number": "PHAC0301002G7P6CGN", 00:30:58.476 "firmware_revision": "JCV10200", 00:30:58.476 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:58.476 "oacs": { 00:30:58.476 "security": 1, 00:30:58.476 "format": 1, 00:30:58.476 "firmware": 1, 00:30:58.476 "ns_manage": 1 00:30:58.476 }, 
00:30:58.476 "multi_ctrlr": false, 00:30:58.476 "ana_reporting": false 00:30:58.477 }, 00:30:58.477 "vs": { 00:30:58.477 "nvme_version": "1.3" 00:30:58.477 }, 00:30:58.477 "ns_data": { 00:30:58.477 "id": 1, 00:30:58.477 "can_share": false 00:30:58.477 }, 00:30:58.477 "security": { 00:30:58.477 "opal": true 00:30:58.477 } 00:30:58.477 } 00:30:58.477 ], 00:30:58.477 "mp_policy": "active_passive" 00:30:58.477 } 00:30:58.477 } 00:30:58.477 ] 00:30:58.477 18:34:41 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:58.477 18:34:41 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:01.007 cde27100-2eaf-4dca-b3b0-7319e327b9f5 00:31:01.007 18:34:44 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:01.007 a20cea64-9493-4e85-9c2e-569fae1b1fa4 00:31:01.007 18:34:44 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:01.007 18:34:44 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:01.007 18:34:44 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:01.007 18:34:44 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:01.007 18:34:44 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:01.007 18:34:44 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:01.007 18:34:44 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:01.265 18:34:44 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:01.524 [ 00:31:01.524 { 00:31:01.524 "name": "a20cea64-9493-4e85-9c2e-569fae1b1fa4", 00:31:01.524 "aliases": [ 00:31:01.524 "lvs0/lv0" 
00:31:01.524 ], 00:31:01.524 "product_name": "Logical Volume", 00:31:01.524 "block_size": 512, 00:31:01.524 "num_blocks": 204800, 00:31:01.524 "uuid": "a20cea64-9493-4e85-9c2e-569fae1b1fa4", 00:31:01.524 "assigned_rate_limits": { 00:31:01.524 "rw_ios_per_sec": 0, 00:31:01.524 "rw_mbytes_per_sec": 0, 00:31:01.524 "r_mbytes_per_sec": 0, 00:31:01.524 "w_mbytes_per_sec": 0 00:31:01.524 }, 00:31:01.524 "claimed": false, 00:31:01.524 "zoned": false, 00:31:01.524 "supported_io_types": { 00:31:01.524 "read": true, 00:31:01.524 "write": true, 00:31:01.524 "unmap": true, 00:31:01.524 "flush": false, 00:31:01.524 "reset": true, 00:31:01.524 "nvme_admin": false, 00:31:01.524 "nvme_io": false, 00:31:01.524 "nvme_io_md": false, 00:31:01.524 "write_zeroes": true, 00:31:01.524 "zcopy": false, 00:31:01.524 "get_zone_info": false, 00:31:01.524 "zone_management": false, 00:31:01.524 "zone_append": false, 00:31:01.524 "compare": false, 00:31:01.524 "compare_and_write": false, 00:31:01.524 "abort": false, 00:31:01.524 "seek_hole": true, 00:31:01.524 "seek_data": true, 00:31:01.524 "copy": false, 00:31:01.524 "nvme_iov_md": false 00:31:01.524 }, 00:31:01.524 "driver_specific": { 00:31:01.524 "lvol": { 00:31:01.524 "lvol_store_uuid": "cde27100-2eaf-4dca-b3b0-7319e327b9f5", 00:31:01.524 "base_bdev": "Nvme0n1", 00:31:01.524 "thin_provision": true, 00:31:01.524 "num_allocated_clusters": 0, 00:31:01.524 "snapshot": false, 00:31:01.524 "clone": false, 00:31:01.524 "esnap_clone": false 00:31:01.524 } 00:31:01.524 } 00:31:01.524 } 00:31:01.524 ] 00:31:01.524 18:34:45 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:01.524 18:34:45 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:01.524 18:34:45 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:01.783 [2024-07-12 18:34:45.354989] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: 
registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:01.783 COMP_lvs0/lv0 00:31:01.783 18:34:45 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:01.783 18:34:45 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:01.783 18:34:45 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:01.783 18:34:45 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:01.783 18:34:45 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:01.783 18:34:45 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:01.783 18:34:45 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:02.042 18:34:45 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:02.301 [ 00:31:02.301 { 00:31:02.301 "name": "COMP_lvs0/lv0", 00:31:02.301 "aliases": [ 00:31:02.301 "d951be4c-3c21-5d8d-ba42-919e977f8cba" 00:31:02.301 ], 00:31:02.301 "product_name": "compress", 00:31:02.301 "block_size": 4096, 00:31:02.301 "num_blocks": 25088, 00:31:02.301 "uuid": "d951be4c-3c21-5d8d-ba42-919e977f8cba", 00:31:02.301 "assigned_rate_limits": { 00:31:02.301 "rw_ios_per_sec": 0, 00:31:02.301 "rw_mbytes_per_sec": 0, 00:31:02.301 "r_mbytes_per_sec": 0, 00:31:02.301 "w_mbytes_per_sec": 0 00:31:02.301 }, 00:31:02.301 "claimed": false, 00:31:02.301 "zoned": false, 00:31:02.301 "supported_io_types": { 00:31:02.301 "read": true, 00:31:02.301 "write": true, 00:31:02.301 "unmap": false, 00:31:02.301 "flush": false, 00:31:02.301 "reset": false, 00:31:02.301 "nvme_admin": false, 00:31:02.301 "nvme_io": false, 00:31:02.301 "nvme_io_md": false, 00:31:02.301 "write_zeroes": true, 00:31:02.301 "zcopy": false, 00:31:02.301 "get_zone_info": false, 00:31:02.301 "zone_management": false, 00:31:02.301 
"zone_append": false, 00:31:02.301 "compare": false, 00:31:02.301 "compare_and_write": false, 00:31:02.301 "abort": false, 00:31:02.301 "seek_hole": false, 00:31:02.301 "seek_data": false, 00:31:02.301 "copy": false, 00:31:02.301 "nvme_iov_md": false 00:31:02.301 }, 00:31:02.301 "driver_specific": { 00:31:02.301 "compress": { 00:31:02.301 "name": "COMP_lvs0/lv0", 00:31:02.301 "base_bdev_name": "a20cea64-9493-4e85-9c2e-569fae1b1fa4" 00:31:02.301 } 00:31:02.301 } 00:31:02.301 } 00:31:02.301 ] 00:31:02.301 18:34:45 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:02.301 18:34:45 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:02.301 Running I/O for 3 seconds... 00:31:05.630 00:31:05.630 Latency(us) 00:31:05.630 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:05.630 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:05.630 Verification LBA range: start 0x0 length 0x3100 00:31:05.630 COMP_lvs0/lv0 : 3.00 3946.91 15.42 0.00 0.00 8053.12 698.10 7978.30 00:31:05.630 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:05.631 Verification LBA range: start 0x3100 length 0x3100 00:31:05.631 COMP_lvs0/lv0 : 3.00 3952.90 15.44 0.00 0.00 8053.47 616.18 7864.32 00:31:05.631 =================================================================================================================== 00:31:05.631 Total : 7899.81 30.86 0.00 0.00 8053.29 616.18 7978.30 00:31:05.631 0 00:31:05.631 18:34:49 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:05.631 18:34:49 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:05.631 18:34:49 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l 
lvs0 00:31:05.889 18:34:49 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:05.889 18:34:49 compress_isal -- compress/compress.sh@78 -- # killprocess 2636365 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2636365 ']' 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2636365 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2636365 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2636365' 00:31:05.889 killing process with pid 2636365 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@967 -- # kill 2636365 00:31:05.889 Received shutdown signal, test time was about 3.000000 seconds 00:31:05.889 00:31:05.889 Latency(us) 00:31:05.889 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:05.889 =================================================================================================================== 00:31:05.889 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:05.889 18:34:49 compress_isal -- common/autotest_common.sh@972 -- # wait 2636365 00:31:09.174 18:34:52 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:31:09.174 18:34:52 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:09.174 18:34:52 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=2638068 00:31:09.174 18:34:52 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:09.174 18:34:52 
compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:31:09.174 18:34:52 compress_isal -- compress/compress.sh@57 -- # waitforlisten 2638068 00:31:09.174 18:34:52 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2638068 ']' 00:31:09.174 18:34:52 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:09.174 18:34:52 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:09.174 18:34:52 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:09.174 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:09.174 18:34:52 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:09.174 18:34:52 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:09.174 [2024-07-12 18:34:52.624114] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:31:09.174 [2024-07-12 18:34:52.624193] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2638068 ] 00:31:09.174 [2024-07-12 18:34:52.758775] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:09.174 [2024-07-12 18:34:52.864691] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:09.174 [2024-07-12 18:34:52.864718] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:09.174 [2024-07-12 18:34:52.864723] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:10.110 18:34:53 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:10.110 18:34:53 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:10.110 18:34:53 compress_isal -- compress/compress.sh@58 -- # create_vols 00:31:10.110 18:34:53 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:10.110 18:34:53 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:10.676 18:34:54 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:10.676 18:34:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:10.676 18:34:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:10.676 18:34:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:10.676 18:34:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:10.676 18:34:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:10.676 18:34:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:10.934 18:34:54 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:10.934 [ 00:31:10.934 { 00:31:10.934 "name": "Nvme0n1", 00:31:10.934 "aliases": [ 00:31:10.934 "01000000-0000-0000-5cd2-e43197705251" 00:31:10.934 ], 00:31:10.934 "product_name": "NVMe disk", 00:31:10.934 "block_size": 512, 00:31:10.934 "num_blocks": 15002931888, 00:31:10.934 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:10.934 "assigned_rate_limits": { 00:31:10.934 "rw_ios_per_sec": 0, 00:31:10.934 "rw_mbytes_per_sec": 0, 00:31:10.934 "r_mbytes_per_sec": 0, 00:31:10.934 "w_mbytes_per_sec": 0 00:31:10.934 }, 00:31:10.934 "claimed": false, 00:31:10.934 "zoned": false, 00:31:10.934 "supported_io_types": { 00:31:10.934 "read": true, 00:31:10.934 "write": true, 00:31:10.934 "unmap": true, 00:31:10.934 "flush": true, 00:31:10.934 "reset": true, 00:31:10.934 "nvme_admin": true, 00:31:10.934 "nvme_io": true, 00:31:10.934 "nvme_io_md": false, 00:31:10.934 "write_zeroes": true, 00:31:10.934 "zcopy": false, 00:31:10.934 "get_zone_info": false, 00:31:10.934 "zone_management": false, 00:31:10.934 "zone_append": false, 00:31:10.934 "compare": false, 00:31:10.934 "compare_and_write": false, 00:31:10.934 "abort": true, 00:31:10.934 "seek_hole": false, 00:31:10.934 "seek_data": false, 00:31:10.934 "copy": false, 00:31:10.934 "nvme_iov_md": false 00:31:10.934 }, 00:31:10.934 "driver_specific": { 00:31:10.934 "nvme": [ 00:31:10.934 { 00:31:10.934 "pci_address": "0000:5e:00.0", 00:31:10.934 "trid": { 00:31:10.934 "trtype": "PCIe", 00:31:10.934 "traddr": "0000:5e:00.0" 00:31:10.934 }, 00:31:10.934 "ctrlr_data": { 00:31:10.934 "cntlid": 0, 00:31:10.934 "vendor_id": "0x8086", 00:31:10.934 "model_number": "INTEL SSDPF2KX076TZO", 00:31:10.934 "serial_number": "PHAC0301002G7P6CGN", 00:31:10.934 "firmware_revision": "JCV10200", 00:31:10.934 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:10.934 "oacs": { 00:31:10.934 "security": 1, 
00:31:10.934 "format": 1, 00:31:10.934 "firmware": 1, 00:31:10.934 "ns_manage": 1 00:31:10.934 }, 00:31:10.934 "multi_ctrlr": false, 00:31:10.934 "ana_reporting": false 00:31:10.934 }, 00:31:10.934 "vs": { 00:31:10.934 "nvme_version": "1.3" 00:31:10.934 }, 00:31:10.934 "ns_data": { 00:31:10.934 "id": 1, 00:31:10.934 "can_share": false 00:31:10.934 }, 00:31:10.934 "security": { 00:31:10.934 "opal": false 00:31:10.934 } 00:31:10.934 } 00:31:10.934 ], 00:31:10.934 "mp_policy": "active_passive" 00:31:10.934 } 00:31:10.934 } 00:31:10.934 ] 00:31:11.193 18:34:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:11.193 18:34:54 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:13.731 c8bae5bd-db02-4ab8-bc47-ebb06e16212a 00:31:13.731 18:34:57 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:13.731 14439287-0804-4147-8088-4d0fee21b433 00:31:13.731 18:34:57 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:13.731 18:34:57 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:13.731 18:34:57 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:13.731 18:34:57 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:13.731 18:34:57 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:13.731 18:34:57 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:13.731 18:34:57 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:13.989 18:34:57 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:14.248 [ 00:31:14.248 { 00:31:14.248 
"name": "14439287-0804-4147-8088-4d0fee21b433", 00:31:14.248 "aliases": [ 00:31:14.248 "lvs0/lv0" 00:31:14.248 ], 00:31:14.248 "product_name": "Logical Volume", 00:31:14.248 "block_size": 512, 00:31:14.248 "num_blocks": 204800, 00:31:14.248 "uuid": "14439287-0804-4147-8088-4d0fee21b433", 00:31:14.248 "assigned_rate_limits": { 00:31:14.248 "rw_ios_per_sec": 0, 00:31:14.248 "rw_mbytes_per_sec": 0, 00:31:14.248 "r_mbytes_per_sec": 0, 00:31:14.248 "w_mbytes_per_sec": 0 00:31:14.248 }, 00:31:14.248 "claimed": false, 00:31:14.248 "zoned": false, 00:31:14.248 "supported_io_types": { 00:31:14.248 "read": true, 00:31:14.248 "write": true, 00:31:14.248 "unmap": true, 00:31:14.248 "flush": false, 00:31:14.248 "reset": true, 00:31:14.248 "nvme_admin": false, 00:31:14.248 "nvme_io": false, 00:31:14.249 "nvme_io_md": false, 00:31:14.249 "write_zeroes": true, 00:31:14.249 "zcopy": false, 00:31:14.249 "get_zone_info": false, 00:31:14.249 "zone_management": false, 00:31:14.249 "zone_append": false, 00:31:14.249 "compare": false, 00:31:14.249 "compare_and_write": false, 00:31:14.249 "abort": false, 00:31:14.249 "seek_hole": true, 00:31:14.249 "seek_data": true, 00:31:14.249 "copy": false, 00:31:14.249 "nvme_iov_md": false 00:31:14.249 }, 00:31:14.249 "driver_specific": { 00:31:14.249 "lvol": { 00:31:14.249 "lvol_store_uuid": "c8bae5bd-db02-4ab8-bc47-ebb06e16212a", 00:31:14.249 "base_bdev": "Nvme0n1", 00:31:14.249 "thin_provision": true, 00:31:14.249 "num_allocated_clusters": 0, 00:31:14.249 "snapshot": false, 00:31:14.249 "clone": false, 00:31:14.249 "esnap_clone": false 00:31:14.249 } 00:31:14.249 } 00:31:14.249 } 00:31:14.249 ] 00:31:14.249 18:34:57 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:14.249 18:34:57 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:14.249 18:34:57 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:14.816 
[2024-07-12 18:34:58.353179] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:14.816 COMP_lvs0/lv0 00:31:14.816 18:34:58 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:14.816 18:34:58 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:14.816 18:34:58 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:14.816 18:34:58 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:14.816 18:34:58 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:14.816 18:34:58 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:14.816 18:34:58 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:15.076 18:34:58 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:15.335 [ 00:31:15.335 { 00:31:15.335 "name": "COMP_lvs0/lv0", 00:31:15.335 "aliases": [ 00:31:15.335 "7bc8579d-ada1-5738-a350-dd5b78ff77a5" 00:31:15.335 ], 00:31:15.335 "product_name": "compress", 00:31:15.335 "block_size": 512, 00:31:15.335 "num_blocks": 200704, 00:31:15.335 "uuid": "7bc8579d-ada1-5738-a350-dd5b78ff77a5", 00:31:15.335 "assigned_rate_limits": { 00:31:15.335 "rw_ios_per_sec": 0, 00:31:15.335 "rw_mbytes_per_sec": 0, 00:31:15.335 "r_mbytes_per_sec": 0, 00:31:15.335 "w_mbytes_per_sec": 0 00:31:15.335 }, 00:31:15.335 "claimed": false, 00:31:15.335 "zoned": false, 00:31:15.335 "supported_io_types": { 00:31:15.335 "read": true, 00:31:15.335 "write": true, 00:31:15.335 "unmap": false, 00:31:15.335 "flush": false, 00:31:15.335 "reset": false, 00:31:15.335 "nvme_admin": false, 00:31:15.335 "nvme_io": false, 00:31:15.335 "nvme_io_md": false, 00:31:15.335 "write_zeroes": true, 00:31:15.335 "zcopy": false, 00:31:15.335 
"get_zone_info": false, 00:31:15.335 "zone_management": false, 00:31:15.335 "zone_append": false, 00:31:15.335 "compare": false, 00:31:15.335 "compare_and_write": false, 00:31:15.335 "abort": false, 00:31:15.335 "seek_hole": false, 00:31:15.335 "seek_data": false, 00:31:15.335 "copy": false, 00:31:15.335 "nvme_iov_md": false 00:31:15.335 }, 00:31:15.335 "driver_specific": { 00:31:15.335 "compress": { 00:31:15.335 "name": "COMP_lvs0/lv0", 00:31:15.335 "base_bdev_name": "14439287-0804-4147-8088-4d0fee21b433" 00:31:15.335 } 00:31:15.335 } 00:31:15.335 } 00:31:15.335 ] 00:31:15.335 18:34:58 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:15.335 18:34:58 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:15.335 I/O targets: 00:31:15.335 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:15.335 00:31:15.335 00:31:15.335 CUnit - A unit testing framework for C - Version 2.1-3 00:31:15.335 http://cunit.sourceforge.net/ 00:31:15.335 00:31:15.335 00:31:15.335 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:15.335 Test: blockdev write read block ...passed 00:31:15.335 Test: blockdev write zeroes read block ...passed 00:31:15.335 Test: blockdev write zeroes read no split ...passed 00:31:15.335 Test: blockdev write zeroes read split ...passed 00:31:15.335 Test: blockdev write zeroes read split partial ...passed 00:31:15.335 Test: blockdev reset ...[2024-07-12 18:34:59.024006] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:15.335 passed 00:31:15.335 Test: blockdev write read 8 blocks ...passed 00:31:15.335 Test: blockdev write read size > 128k ...passed 00:31:15.335 Test: blockdev write read invalid size ...passed 00:31:15.335 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:15.335 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:15.335 Test: blockdev write read max offset 
...passed 00:31:15.335 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:15.335 Test: blockdev writev readv 8 blocks ...passed 00:31:15.335 Test: blockdev writev readv 30 x 1block ...passed 00:31:15.335 Test: blockdev writev readv block ...passed 00:31:15.335 Test: blockdev writev readv size > 128k ...passed 00:31:15.335 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:15.335 Test: blockdev comparev and writev ...passed 00:31:15.335 Test: blockdev nvme passthru rw ...passed 00:31:15.335 Test: blockdev nvme passthru vendor specific ...passed 00:31:15.335 Test: blockdev nvme admin passthru ...passed 00:31:15.335 Test: blockdev copy ...passed 00:31:15.335 00:31:15.335 Run Summary: Type Total Ran Passed Failed Inactive 00:31:15.335 suites 1 1 n/a 0 0 00:31:15.335 tests 23 23 23 0 0 00:31:15.335 asserts 130 130 130 0 n/a 00:31:15.335 00:31:15.335 Elapsed time = 0.108 seconds 00:31:15.335 0 00:31:15.335 18:34:59 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:31:15.335 18:34:59 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:15.594 18:34:59 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:15.852 18:34:59 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:15.852 18:34:59 compress_isal -- compress/compress.sh@62 -- # killprocess 2638068 00:31:15.852 18:34:59 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2638068 ']' 00:31:15.852 18:34:59 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2638068 00:31:15.852 18:34:59 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:15.852 18:34:59 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:15.852 18:34:59 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
2638068 00:31:16.111 18:34:59 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:16.111 18:34:59 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:16.111 18:34:59 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2638068' 00:31:16.111 killing process with pid 2638068 00:31:16.111 18:34:59 compress_isal -- common/autotest_common.sh@967 -- # kill 2638068 00:31:16.111 18:34:59 compress_isal -- common/autotest_common.sh@972 -- # wait 2638068 00:31:19.395 18:35:02 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:19.395 18:35:02 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:19.395 00:31:19.395 real 0m47.695s 00:31:19.395 user 1m52.405s 00:31:19.395 sys 0m4.310s 00:31:19.395 18:35:02 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:19.395 18:35:02 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:19.395 ************************************ 00:31:19.395 END TEST compress_isal 00:31:19.395 ************************************ 00:31:19.395 18:35:02 -- common/autotest_common.sh@1142 -- # return 0 00:31:19.395 18:35:02 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:31:19.395 18:35:02 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:31:19.395 18:35:02 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:19.395 18:35:02 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:19.395 18:35:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:19.395 18:35:02 -- common/autotest_common.sh@10 -- # set +x 00:31:19.395 ************************************ 00:31:19.395 START TEST blockdev_crypto_aesni 00:31:19.395 ************************************ 00:31:19.395 18:35:02 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 
crypto_aesni 00:31:19.395 * Looking for test storage... 00:31:19.395 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:31:19.395 18:35:02 blockdev_crypto_aesni -- 
bdev/blockdev.sh@684 -- # dek= 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:31:19.395 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2639372 00:31:19.396 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:19.396 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 2639372 00:31:19.396 18:35:02 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 2639372 ']' 00:31:19.396 18:35:02 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:19.396 18:35:02 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:19.396 18:35:02 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:19.396 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:31:19.396 18:35:02 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:19.396 18:35:02 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:19.396 18:35:02 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:19.396 [2024-07-12 18:35:02.859292] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:31:19.396 [2024-07-12 18:35:02.859439] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2639372 ] 00:31:19.396 [2024-07-12 18:35:03.052912] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:19.654 [2024-07-12 18:35:03.150935] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:20.220 18:35:03 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:20.220 18:35:03 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:31:20.220 18:35:03 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:31:20.220 18:35:03 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:31:20.220 18:35:03 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:31:20.220 18:35:03 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:20.220 18:35:03 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:20.220 [2024-07-12 18:35:03.728831] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:20.220 [2024-07-12 18:35:03.736864] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:20.220 [2024-07-12 18:35:03.744882] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation decrypt will be assigned to module dpdk_cryptodev 00:31:20.220 [2024-07-12 18:35:03.810763] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:22.755 true 00:31:22.755 true 00:31:22.755 true 00:31:22.755 true 00:31:22.755 Malloc0 00:31:22.755 Malloc1 00:31:22.755 Malloc2 00:31:22.755 Malloc3 00:31:22.755 [2024-07-12 18:35:06.209287] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:22.755 crypto_ram 00:31:22.755 [2024-07-12 18:35:06.217306] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:22.755 crypto_ram2 00:31:22.755 [2024-07-12 18:35:06.225326] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:22.755 crypto_ram3 00:31:22.755 [2024-07-12 18:35:06.233350] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:22.755 crypto_ram4 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # 
rpc_cmd save_subsystem_config -n bdev 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "1a666f97-3407-5f0c-9fd2-130d4e9b6548"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1a666f97-3407-5f0c-9fd2-130d4e9b6548",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": 
false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "13aef702-7b78-5eda-96a5-f15b57ab937b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "13aef702-7b78-5eda-96a5-f15b57ab937b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": 
"test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "da2b7215-64f1-5366-b613-e9b5230ddbc2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "da2b7215-64f1-5366-b613-e9b5230ddbc2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "55438745-dfe1-560f-84cd-20f8b23f1ae3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "55438745-dfe1-560f-84cd-20f8b23f1ae3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:31:22.755 18:35:06 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 2639372 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 2639372 ']' 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 2639372 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:22.755 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2639372 00:31:23.015 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:23.015 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:23.015 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2639372' 00:31:23.015 killing process with pid 2639372 00:31:23.015 18:35:06 
blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 2639372 00:31:23.015 18:35:06 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 2639372 00:31:23.581 18:35:07 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:23.581 18:35:07 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:23.581 18:35:07 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:31:23.581 18:35:07 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:23.581 18:35:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:23.581 ************************************ 00:31:23.581 START TEST bdev_hello_world 00:31:23.581 ************************************ 00:31:23.581 18:35:07 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:23.581 [2024-07-12 18:35:07.163350] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:31:23.581 [2024-07-12 18:35:07.163417] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2639916 ] 00:31:23.581 [2024-07-12 18:35:07.295631] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:23.839 [2024-07-12 18:35:07.404484] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:23.839 [2024-07-12 18:35:07.425754] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:23.839 [2024-07-12 18:35:07.433782] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:23.839 [2024-07-12 18:35:07.441809] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:23.839 [2024-07-12 18:35:07.552190] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:26.371 [2024-07-12 18:35:09.769129] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:26.371 [2024-07-12 18:35:09.769199] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:26.371 [2024-07-12 18:35:09.769215] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:26.371 [2024-07-12 18:35:09.777147] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:26.371 [2024-07-12 18:35:09.777166] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:26.371 [2024-07-12 18:35:09.777178] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:26.371 [2024-07-12 18:35:09.785167] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found 
key "test_dek_aesni_cbc_3" 00:31:26.371 [2024-07-12 18:35:09.785185] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:26.371 [2024-07-12 18:35:09.785197] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:26.371 [2024-07-12 18:35:09.793187] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:26.371 [2024-07-12 18:35:09.793203] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:26.371 [2024-07-12 18:35:09.793215] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:26.371 [2024-07-12 18:35:09.865971] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:31:26.371 [2024-07-12 18:35:09.866014] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:31:26.371 [2024-07-12 18:35:09.866034] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:31:26.371 [2024-07-12 18:35:09.867308] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:31:26.371 [2024-07-12 18:35:09.867381] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:31:26.371 [2024-07-12 18:35:09.867402] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:31:26.371 [2024-07-12 18:35:09.867447] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:31:26.371 00:31:26.371 [2024-07-12 18:35:09.867465] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:31:26.639 00:31:26.639 real 0m3.148s 00:31:26.639 user 0m2.734s 00:31:26.639 sys 0m0.371s 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:31:26.639 ************************************ 00:31:26.639 END TEST bdev_hello_world 00:31:26.639 ************************************ 00:31:26.639 18:35:10 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:26.639 18:35:10 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:31:26.639 18:35:10 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:26.639 18:35:10 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:26.639 18:35:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:26.639 ************************************ 00:31:26.639 START TEST bdev_bounds 00:31:26.639 ************************************ 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2640443 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2640443' 00:31:26.639 Process bdevio pid: 2640443 00:31:26.639 18:35:10 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2640443 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2640443 ']' 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:26.639 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:26.639 18:35:10 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:26.930 [2024-07-12 18:35:10.395076] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:31:26.930 [2024-07-12 18:35:10.395147] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2640443 ] 00:31:26.930 [2024-07-12 18:35:10.527172] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:26.930 [2024-07-12 18:35:10.630476] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:26.930 [2024-07-12 18:35:10.630563] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:26.930 [2024-07-12 18:35:10.630569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:27.201 [2024-07-12 18:35:10.651942] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:27.201 [2024-07-12 18:35:10.659955] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:27.201 [2024-07-12 18:35:10.667974] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:27.201 [2024-07-12 18:35:10.775873] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:29.736 [2024-07-12 18:35:12.987071] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:29.736 [2024-07-12 18:35:12.987172] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:29.736 [2024-07-12 18:35:12.987188] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:29.736 [2024-07-12 18:35:12.995087] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:29.736 [2024-07-12 18:35:12.995107] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:29.736 [2024-07-12 
18:35:12.995119] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:29.736 [2024-07-12 18:35:13.003109] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:29.736 [2024-07-12 18:35:13.003130] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:29.736 [2024-07-12 18:35:13.003142] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:29.736 [2024-07-12 18:35:13.011134] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:29.736 [2024-07-12 18:35:13.011152] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:29.736 [2024-07-12 18:35:13.011163] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:29.736 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:29.736 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:31:29.736 18:35:13 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:29.736 I/O targets: 00:31:29.736 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:31:29.736 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:31:29.736 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:31:29.736 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:31:29.736 00:31:29.736 00:31:29.736 CUnit - A unit testing framework for C - Version 2.1-3 00:31:29.736 http://cunit.sourceforge.net/ 00:31:29.736 00:31:29.737 00:31:29.737 Suite: bdevio tests on: crypto_ram4 00:31:29.737 Test: blockdev write read block ...passed 00:31:29.737 Test: blockdev write zeroes read block ...passed 00:31:29.737 Test: blockdev write zeroes read no split ...passed 00:31:29.737 Test: blockdev 
write zeroes read split ...passed 00:31:29.737 Test: blockdev write zeroes read split partial ...passed 00:31:29.737 Test: blockdev reset ...passed 00:31:29.737 Test: blockdev write read 8 blocks ...passed 00:31:29.737 Test: blockdev write read size > 128k ...passed 00:31:29.737 Test: blockdev write read invalid size ...passed 00:31:29.737 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:29.737 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:29.737 Test: blockdev write read max offset ...passed 00:31:29.737 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:29.737 Test: blockdev writev readv 8 blocks ...passed 00:31:29.737 Test: blockdev writev readv 30 x 1block ...passed 00:31:29.737 Test: blockdev writev readv block ...passed 00:31:29.737 Test: blockdev writev readv size > 128k ...passed 00:31:29.737 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:29.737 Test: blockdev comparev and writev ...passed 00:31:29.737 Test: blockdev nvme passthru rw ...passed 00:31:29.737 Test: blockdev nvme passthru vendor specific ...passed 00:31:29.737 Test: blockdev nvme admin passthru ...passed 00:31:29.737 Test: blockdev copy ...passed 00:31:29.737 Suite: bdevio tests on: crypto_ram3 00:31:29.737 Test: blockdev write read block ...passed 00:31:29.737 Test: blockdev write zeroes read block ...passed 00:31:29.737 Test: blockdev write zeroes read no split ...passed 00:31:29.737 Test: blockdev write zeroes read split ...passed 00:31:29.737 Test: blockdev write zeroes read split partial ...passed 00:31:29.737 Test: blockdev reset ...passed 00:31:29.737 Test: blockdev write read 8 blocks ...passed 00:31:29.737 Test: blockdev write read size > 128k ...passed 00:31:29.737 Test: blockdev write read invalid size ...passed 00:31:29.737 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:29.737 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:31:29.737 Test: blockdev write read max offset ...passed 00:31:29.737 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:29.737 Test: blockdev writev readv 8 blocks ...passed 00:31:29.737 Test: blockdev writev readv 30 x 1block ...passed 00:31:29.737 Test: blockdev writev readv block ...passed 00:31:29.737 Test: blockdev writev readv size > 128k ...passed 00:31:29.737 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:29.737 Test: blockdev comparev and writev ...passed 00:31:29.737 Test: blockdev nvme passthru rw ...passed 00:31:29.737 Test: blockdev nvme passthru vendor specific ...passed 00:31:29.737 Test: blockdev nvme admin passthru ...passed 00:31:29.737 Test: blockdev copy ...passed 00:31:29.737 Suite: bdevio tests on: crypto_ram2 00:31:29.737 Test: blockdev write read block ...passed 00:31:29.737 Test: blockdev write zeroes read block ...passed 00:31:29.737 Test: blockdev write zeroes read no split ...passed 00:31:29.737 Test: blockdev write zeroes read split ...passed 00:31:29.737 Test: blockdev write zeroes read split partial ...passed 00:31:29.737 Test: blockdev reset ...passed 00:31:29.737 Test: blockdev write read 8 blocks ...passed 00:31:29.737 Test: blockdev write read size > 128k ...passed 00:31:29.737 Test: blockdev write read invalid size ...passed 00:31:29.737 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:29.737 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:29.737 Test: blockdev write read max offset ...passed 00:31:29.737 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:29.737 Test: blockdev writev readv 8 blocks ...passed 00:31:29.737 Test: blockdev writev readv 30 x 1block ...passed 00:31:29.737 Test: blockdev writev readv block ...passed 00:31:29.737 Test: blockdev writev readv size > 128k ...passed 00:31:29.737 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:29.737 Test: 
blockdev comparev and writev ...passed 00:31:29.737 Test: blockdev nvme passthru rw ...passed 00:31:29.737 Test: blockdev nvme passthru vendor specific ...passed 00:31:29.737 Test: blockdev nvme admin passthru ...passed 00:31:29.737 Test: blockdev copy ...passed 00:31:29.737 Suite: bdevio tests on: crypto_ram 00:31:29.737 Test: blockdev write read block ...passed 00:31:29.737 Test: blockdev write zeroes read block ...passed 00:31:29.737 Test: blockdev write zeroes read no split ...passed 00:31:29.737 Test: blockdev write zeroes read split ...passed 00:31:29.996 Test: blockdev write zeroes read split partial ...passed 00:31:29.996 Test: blockdev reset ...passed 00:31:29.996 Test: blockdev write read 8 blocks ...passed 00:31:29.996 Test: blockdev write read size > 128k ...passed 00:31:29.996 Test: blockdev write read invalid size ...passed 00:31:29.996 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:29.996 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:29.996 Test: blockdev write read max offset ...passed 00:31:29.996 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:29.996 Test: blockdev writev readv 8 blocks ...passed 00:31:29.996 Test: blockdev writev readv 30 x 1block ...passed 00:31:29.996 Test: blockdev writev readv block ...passed 00:31:29.996 Test: blockdev writev readv size > 128k ...passed 00:31:29.996 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:29.996 Test: blockdev comparev and writev ...passed 00:31:29.996 Test: blockdev nvme passthru rw ...passed 00:31:29.996 Test: blockdev nvme passthru vendor specific ...passed 00:31:29.996 Test: blockdev nvme admin passthru ...passed 00:31:29.996 Test: blockdev copy ...passed 00:31:29.996 00:31:29.996 Run Summary: Type Total Ran Passed Failed Inactive 00:31:29.996 suites 4 4 n/a 0 0 00:31:29.996 tests 92 92 92 0 0 00:31:29.996 asserts 520 520 520 0 n/a 00:31:29.996 00:31:29.996 Elapsed time = 0.533 
seconds 00:31:29.996 0 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2640443 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2640443 ']' 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2640443 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2640443 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2640443' 00:31:29.996 killing process with pid 2640443 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2640443 00:31:29.996 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2640443 00:31:30.255 18:35:13 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:31:30.255 00:31:30.255 real 0m3.643s 00:31:30.255 user 0m10.030s 00:31:30.255 sys 0m0.570s 00:31:30.255 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:30.255 18:35:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:30.255 ************************************ 00:31:30.255 END TEST bdev_bounds 00:31:30.255 ************************************ 00:31:30.514 18:35:14 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:30.514 18:35:14 
blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:31:30.514 18:35:14 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:31:30.514 18:35:14 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:30.514 18:35:14 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:30.514 ************************************ 00:31:30.514 START TEST bdev_nbd 00:31:30.514 ************************************ 00:31:30.514 18:35:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:31:30.514 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2640843 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2640843 /var/tmp/spdk-nbd.sock 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2640843 ']' 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:31:30.515 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:30.515 18:35:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:30.515 [2024-07-12 18:35:14.136368] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:31:30.515 [2024-07-12 18:35:14.136441] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:30.774 [2024-07-12 18:35:14.267688] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:30.774 [2024-07-12 18:35:14.368889] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:30.774 [2024-07-12 18:35:14.390183] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:30.774 [2024-07-12 18:35:14.398204] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:30.774 [2024-07-12 18:35:14.406222] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:31.034 [2024-07-12 18:35:14.519088] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:33.565 [2024-07-12 18:35:16.743190] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:33.565 [2024-07-12 18:35:16.743257] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:33.565 [2024-07-12 18:35:16.743272] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:33.565 [2024-07-12 18:35:16.751209] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_2" 00:31:33.565 [2024-07-12 18:35:16.751227] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:33.565 [2024-07-12 18:35:16.751239] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:33.565 [2024-07-12 18:35:16.759229] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:33.565 [2024-07-12 18:35:16.759247] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:33.565 [2024-07-12 18:35:16.759258] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:33.565 [2024-07-12 18:35:16.767250] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:33.565 [2024-07-12 18:35:16.767267] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:33.565 [2024-07-12 18:35:16.767278] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:33.565 18:35:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:33.565 18:35:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:31:33.565 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:31:33.565 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:33.565 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:33.565 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:31:33.565 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:31:33.565 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:33.566 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:33.566 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:31:33.566 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:31:33.566 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:31:33.566 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:31:33.566 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:33.566 18:35:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:31:33.823 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:31:33.823 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:31:33.823 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:31:33.823 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:33.824 18:35:17 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:33.824 1+0 records in 00:31:33.824 1+0 records out 00:31:33.824 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000305517 s, 13.4 MB/s 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:33.824 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:31:34.081 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:34.082 18:35:17 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:34.082 1+0 records in 00:31:34.082 1+0 records out 00:31:34.082 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000303712 s, 13.5 MB/s 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:34.082 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:34.340 1+0 records in 00:31:34.340 1+0 records out 00:31:34.340 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000330102 s, 12.4 MB/s 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:34.340 18:35:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:31:34.599 1+0 records in 00:31:34.599 1+0 records out 00:31:34.599 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000344628 s, 11.9 MB/s 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:31:34.599 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:31:34.858 { 00:31:34.858 "nbd_device": "/dev/nbd0", 00:31:34.858 "bdev_name": "crypto_ram" 00:31:34.858 }, 00:31:34.858 { 00:31:34.858 "nbd_device": "/dev/nbd1", 00:31:34.858 "bdev_name": "crypto_ram2" 00:31:34.858 }, 00:31:34.858 { 00:31:34.858 "nbd_device": "/dev/nbd2", 00:31:34.858 "bdev_name": "crypto_ram3" 00:31:34.858 }, 00:31:34.858 { 00:31:34.858 "nbd_device": "/dev/nbd3", 00:31:34.858 "bdev_name": "crypto_ram4" 00:31:34.858 } 00:31:34.858 ]' 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:31:34.858 { 
00:31:34.858 "nbd_device": "/dev/nbd0", 00:31:34.858 "bdev_name": "crypto_ram" 00:31:34.858 }, 00:31:34.858 { 00:31:34.858 "nbd_device": "/dev/nbd1", 00:31:34.858 "bdev_name": "crypto_ram2" 00:31:34.858 }, 00:31:34.858 { 00:31:34.858 "nbd_device": "/dev/nbd2", 00:31:34.858 "bdev_name": "crypto_ram3" 00:31:34.858 }, 00:31:34.858 { 00:31:34.858 "nbd_device": "/dev/nbd3", 00:31:34.858 "bdev_name": "crypto_ram4" 00:31:34.858 } 00:31:34.858 ]' 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:34.858 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:35.117 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:35.117 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:35.117 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:35.117 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:35.117 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:35.117 18:35:18 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:35.117 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:35.117 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:35.117 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:35.117 18:35:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:35.376 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 
00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:35.634 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_get_disks 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:36.201 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 
-- # local nbd_list 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:36.460 18:35:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:31:36.460 /dev/nbd0 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:36.719 
18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:36.719 1+0 records in 00:31:36.719 1+0 records out 00:31:36.719 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261968 s, 15.6 MB/s 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:36.719 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:36.720 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:36.720 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:36.720 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:36.720 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:36.720 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:31:36.979 /dev/nbd1 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- 
# waitfornbd nbd1 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:36.979 1+0 records in 00:31:36.979 1+0 records out 00:31:36.979 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000341445 s, 12.0 MB/s 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:36.979 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:31:37.238 /dev/nbd10 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:37.238 1+0 records in 00:31:37.238 1+0 records out 00:31:37.238 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238728 s, 17.2 MB/s 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:37.238 18:35:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:31:37.498 /dev/nbd11 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:31:37.498 1+0 records in 00:31:37.498 1+0 records out 00:31:37.498 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281393 s, 14.6 MB/s 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:37.498 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:31:37.757 { 00:31:37.757 "nbd_device": "/dev/nbd0", 00:31:37.757 "bdev_name": "crypto_ram" 00:31:37.757 }, 00:31:37.757 { 00:31:37.757 "nbd_device": "/dev/nbd1", 00:31:37.757 "bdev_name": "crypto_ram2" 00:31:37.757 }, 00:31:37.757 { 00:31:37.757 "nbd_device": "/dev/nbd10", 00:31:37.757 "bdev_name": "crypto_ram3" 00:31:37.757 }, 00:31:37.757 { 00:31:37.757 "nbd_device": "/dev/nbd11", 00:31:37.757 "bdev_name": "crypto_ram4" 00:31:37.757 } 00:31:37.757 ]' 00:31:37.757 18:35:21 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:31:37.757 { 00:31:37.757 "nbd_device": "/dev/nbd0", 00:31:37.757 "bdev_name": "crypto_ram" 00:31:37.757 }, 00:31:37.757 { 00:31:37.757 "nbd_device": "/dev/nbd1", 00:31:37.757 "bdev_name": "crypto_ram2" 00:31:37.757 }, 00:31:37.757 { 00:31:37.757 "nbd_device": "/dev/nbd10", 00:31:37.757 "bdev_name": "crypto_ram3" 00:31:37.757 }, 00:31:37.757 { 00:31:37.757 "nbd_device": "/dev/nbd11", 00:31:37.757 "bdev_name": "crypto_ram4" 00:31:37.757 } 00:31:37.757 ]' 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:31:37.757 /dev/nbd1 00:31:37.757 /dev/nbd10 00:31:37.757 /dev/nbd11' 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:31:37.757 /dev/nbd1 00:31:37.757 /dev/nbd10 00:31:37.757 /dev/nbd11' 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:31:37.757 18:35:21 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:31:37.757 256+0 records in 00:31:37.757 256+0 records out 00:31:37.757 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102851 s, 102 MB/s 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:31:37.757 256+0 records in 00:31:37.757 256+0 records out 00:31:37.757 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0610722 s, 17.2 MB/s 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:31:37.757 256+0 records in 00:31:37.757 256+0 records out 00:31:37.757 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0458073 s, 22.9 MB/s 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:37.757 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:31:38.017 256+0 records in 00:31:38.017 256+0 records out 00:31:38.017 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0559384 s, 18.7 MB/s 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:31:38.017 256+0 records in 00:31:38.017 256+0 records out 00:31:38.017 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.057077 s, 18.4 MB/s 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:38.017 18:35:21 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:38.017 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:38.276 18:35:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:38.535 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:31:38.794 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:31:38.794 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:31:38.794 18:35:22 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:31:38.795 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:38.795 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:38.795 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:31:38.795 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:38.795 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:38.795 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:38.795 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:39.054 18:35:22 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:31:39.311 18:35:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:31:39.568 malloc_lvol_verify 00:31:39.568 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:31:39.827 93023003-5bec-43be-92a6-9a7ed95adf63 00:31:39.827 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:31:40.086 ef6f1ece-77f4-46b8-b8cf-8f9ecb8be6f4 00:31:40.086 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:31:40.343 /dev/nbd0 00:31:40.343 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:31:40.343 mke2fs 1.46.5 (30-Dec-2021) 00:31:40.343 Discarding device blocks: 0/4096 done 00:31:40.343 Creating filesystem with 4096 1k blocks and 1024 inodes 00:31:40.343 00:31:40.343 Allocating group tables: 0/1 done 00:31:40.343 Writing inode tables: 0/1 done 00:31:40.343 Creating journal (1024 blocks): done 00:31:40.343 Writing superblocks and filesystem accounting information: 0/1 done 00:31:40.343 00:31:40.343 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:31:40.343 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:31:40.343 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:40.343 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:31:40.343 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:31:40.343 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:40.343 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:40.343 18:35:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2640843 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2640843 ']' 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2640843 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2640843 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2640843' 00:31:40.601 killing process with pid 2640843 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2640843 00:31:40.601 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2640843 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:31:41.169 00:31:41.169 real 0m10.664s 00:31:41.169 user 0m14.023s 00:31:41.169 sys 0m4.123s 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:41.169 ************************************ 00:31:41.169 END TEST bdev_nbd 00:31:41.169 ************************************ 00:31:41.169 18:35:24 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:41.169 18:35:24 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:31:41.169 18:35:24 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:31:41.169 18:35:24 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:31:41.169 18:35:24 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:31:41.169 18:35:24 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:41.169 18:35:24 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:41.169 18:35:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:31:41.169 ************************************ 00:31:41.169 START TEST bdev_fio 00:31:41.169 ************************************ 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:41.169 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 
00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:31:41.169 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:31:41.170 18:35:24 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:41.170 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:41.170 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:41.170 18:35:24 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:41.429 ************************************ 00:31:41.429 START TEST bdev_fio_rw_verify 00:31:41.429 ************************************ 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 
00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:41.429 18:35:24 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:41.429 18:35:25 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:41.429 18:35:25 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:41.429 18:35:25 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:41.429 18:35:25 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:41.688 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:41.688 
job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:31:41.688 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:31:41.688 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:31:41.688 fio-3.35
00:31:41.688 Starting 4 threads
00:31:56.797
00:31:56.797 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2642949: Fri Jul 12 18:35:38 2024
00:31:56.797 read: IOPS=22.9k, BW=89.6MiB/s (93.9MB/s)(896MiB/10001msec)
00:31:56.797 slat (usec): min=11, max=1174, avg=59.67, stdev=41.94
00:31:56.797 clat (usec): min=9, max=2475, avg=316.62, stdev=242.91
00:31:56.797 lat (usec): min=41, max=2739, avg=376.29, stdev=270.83
00:31:56.797 clat percentiles (usec):
00:31:56.797 | 50.000th=[ 243], 99.000th=[ 1123], 99.900th=[ 1287], 99.990th=[ 1565],
00:31:56.797 | 99.999th=[ 2278]
00:31:56.797 write: IOPS=25.1k, BW=98.2MiB/s (103MB/s)(957MiB/9744msec); 0 zone resets
00:31:56.797 slat (usec): min=14, max=492, avg=70.38, stdev=41.22
00:31:56.797 clat (usec): min=23, max=1694, avg=376.62, stdev=280.07
00:31:56.797 lat (usec): min=51, max=1839, avg=447.00, stdev=307.20
00:31:56.797 clat percentiles (usec):
00:31:56.797 | 50.000th=[ 306], 99.000th=[ 1352], 99.900th=[ 1516], 99.990th=[ 1598],
00:31:56.797 | 99.999th=[ 1647]
00:31:56.797 bw ( KiB/s): min=80616, max=124784, per=98.70%, avg=99221.05, stdev=2771.09, samples=76
00:31:56.797 iops : min=20154, max=31196, avg=24805.26, stdev=692.77, samples=76
00:31:56.797 lat (usec) : 10=0.01%, 20=0.01%, 50=1.59%, 100=8.54%, 250=34.85%
00:31:56.797 lat (usec) : 500=35.95%, 750=9.21%, 1000=6.11%
00:31:56.797 lat (msec) : 2=3.74%, 4=0.01%
00:31:56.797 cpu : usr=99.61%, sys=0.01%, ctx=66, majf=0, minf=280
00:31:56.797 IO depths : 1=10.6%, 2=25.5%, 4=50.9%, 8=13.0%, 16=0.0%, 32=0.0%, >=64=0.0%
00:31:56.797 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:31:56.797 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:31:56.797 issued rwts: total=229377,244878,0,0 short=0,0,0,0 dropped=0,0,0,0
00:31:56.797 latency : target=0, window=0, percentile=100.00%, depth=8
00:31:56.797
00:31:56.797 Run status group 0 (all jobs):
00:31:56.797 READ: bw=89.6MiB/s (93.9MB/s), 89.6MiB/s-89.6MiB/s (93.9MB/s-93.9MB/s), io=896MiB (940MB), run=10001-10001msec
00:31:56.797 WRITE: bw=98.2MiB/s (103MB/s), 98.2MiB/s-98.2MiB/s (103MB/s-103MB/s), io=957MiB (1003MB), run=9744-9744msec
00:31:56.797
00:31:56.797 real 0m13.521s
00:31:56.797 user 0m45.623s
00:31:56.797 sys 0m0.497s
00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x
00:31:56.797 ************************************
00:31:56.797 END TEST bdev_fio_rw_verify
00:31:56.797 ************************************
00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0
00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f
00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' ''
00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim
00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=
00:31:56.797 18:35:38
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:31:56.797 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "1a666f97-3407-5f0c-9fd2-130d4e9b6548"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1a666f97-3407-5f0c-9fd2-130d4e9b6548",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": 
false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "13aef702-7b78-5eda-96a5-f15b57ab937b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "13aef702-7b78-5eda-96a5-f15b57ab937b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": 
"crypto_ram3",' ' "aliases": [' ' "da2b7215-64f1-5366-b613-e9b5230ddbc2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "da2b7215-64f1-5366-b613-e9b5230ddbc2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "55438745-dfe1-560f-84cd-20f8b23f1ae3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "55438745-dfe1-560f-84cd-20f8b23f1ae3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:31:56.798 crypto_ram2 00:31:56.798 crypto_ram3 00:31:56.798 crypto_ram4 ]] 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "1a666f97-3407-5f0c-9fd2-130d4e9b6548"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1a666f97-3407-5f0c-9fd2-130d4e9b6548",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' 
{' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "13aef702-7b78-5eda-96a5-f15b57ab937b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "13aef702-7b78-5eda-96a5-f15b57ab937b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "da2b7215-64f1-5366-b613-e9b5230ddbc2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "da2b7215-64f1-5366-b613-e9b5230ddbc2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' 
' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "55438745-dfe1-560f-84cd-20f8b23f1ae3"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "55438745-dfe1-560f-84cd-20f8b23f1ae3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": 
"crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:56.798 ************************************ 00:31:56.798 START TEST bdev_fio_trim 00:31:56.798 ************************************ 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local 
plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev'
00:31:56.798 18:35:38 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:31:56.798 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:31:56.798 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:31:56.798 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:31:56.798 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:31:56.798 fio-3.35
00:31:56.798 Starting 4 threads
00:32:09.078
00:32:09.078 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2644901: Fri Jul 12 18:35:52 2024
00:32:09.078 write: IOPS=37.1k, BW=145MiB/s (152MB/s)(1448MiB/10001msec); 0 zone resets
00:32:09.078 slat (usec): min=17, max=1401, avg=60.24, stdev=34.09
00:32:09.078 clat (usec): min=25, max=2226, avg=274.80, stdev=168.48
00:32:09.078 lat (usec): min=50, max=2306, avg=335.04, stdev=188.02
00:32:09.078 clat percentiles (usec):
00:32:09.078 | 50.000th=[ 233], 99.000th=[ 848], 99.900th=[ 1057], 99.990th=[ 1352],
00:32:09.078 | 99.999th=[ 1860]
00:32:09.078 bw ( KiB/s): min=138696, max=181904, per=100.00%, avg=148385.68, stdev=2495.22, samples=76
00:32:09.078 iops : min=34674, max=45478, avg=37096.32, stdev=623.85, samples=76
00:32:09.078 trim: IOPS=37.1k, BW=145MiB/s (152MB/s)(1448MiB/10001msec); 0 zone resets
00:32:09.078 slat (usec): min=5, max=455, avg=17.67, stdev= 7.47
00:32:09.078 clat (usec): min=50, max=1858, avg=257.99, stdev=111.45
00:32:09.078 lat (usec): min=60, max=1882, avg=275.66, stdev=113.67
00:32:09.078 clat percentiles (usec):
00:32:09.078 | 50.000th=[ 243], 99.000th=[ 586], 99.900th=[ 725], 99.990th=[ 963],
00:32:09.078 | 99.999th=[ 1663]
00:32:09.078 bw ( KiB/s): min=138696, max=181960, per=100.00%, avg=148386.53, stdev=2496.65, samples=76
00:32:09.078 iops : min=34674, max=45490, avg=37096.63, stdev=624.16, samples=76
00:32:09.078 lat (usec) : 50=0.01%, 100=5.66%, 250=48.56%, 500=38.88%, 750=5.77%
00:32:09.078 lat (usec) : 1000=1.03%
00:32:09.078 lat (msec) : 2=0.10%, 4=0.01%
00:32:09.078 cpu : usr=99.62%, sys=0.00%, ctx=58, majf=0, minf=99
00:32:09.078 IO depths : 1=7.8%, 2=26.4%, 4=52.7%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0%
00:32:09.078 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:32:09.078 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:32:09.078 issued rwts: total=0,370601,370602,0 short=0,0,0,0 dropped=0,0,0,0
00:32:09.078 latency : target=0, window=0, percentile=100.00%, depth=8
00:32:09.078
00:32:09.078 Run status group 0 (all jobs):
00:32:09.078 WRITE: bw=145MiB/s (152MB/s), 145MiB/s-145MiB/s (152MB/s-152MB/s), io=1448MiB (1518MB), run=10001-10001msec
00:32:09.078 TRIM: bw=145MiB/s (152MB/s), 145MiB/s-145MiB/s (152MB/s-152MB/s), io=1448MiB (1518MB), run=10001-10001msec
00:32:09.078
00:32:09.078 real 0m13.778s
00:32:09.078 user 0m45.733s
00:32:09.078 sys 0m0.519s
00:32:09.078 18:35:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:09.078 18:35:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:32:09.078 ************************************
00:32:09.078 END TEST bdev_fio_trim
00:32:09.078 ************************************
00:32:09.078 18:35:52 blockdev_crypto_aesni.bdev_fio --
common/autotest_common.sh@1142 -- # return 0 00:32:09.078 18:35:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:32:09.078 18:35:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:09.078 18:35:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:32:09.078 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:09.078 18:35:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:32:09.078 00:32:09.078 real 0m27.670s 00:32:09.078 user 1m31.543s 00:32:09.078 sys 0m1.223s 00:32:09.078 18:35:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:09.078 18:35:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:09.078 ************************************ 00:32:09.078 END TEST bdev_fio 00:32:09.078 ************************************ 00:32:09.078 18:35:52 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:09.078 18:35:52 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:09.078 18:35:52 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:09.078 18:35:52 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:09.079 18:35:52 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:09.079 18:35:52 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:09.079 ************************************ 00:32:09.079 START TEST bdev_verify 00:32:09.079 ************************************ 00:32:09.079 18:35:52 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:32:09.079 [2024-07-12 18:35:52.797053] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:32:09.079 [2024-07-12 18:35:52.797118] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2646161 ]
00:32:09.338 [2024-07-12 18:35:52.927330] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2
00:32:09.338 [2024-07-12 18:35:53.026620] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:32:09.338 [2024-07-12 18:35:53.026625] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:09.338 [2024-07-12 18:35:53.048000] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:32:09.338 [2024-07-12 18:35:53.056031] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:32:09.338 [2024-07-12 18:35:53.064067] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:32:09.598 [2024-07-12 18:35:53.176456] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:32:12.126 [2024-07-12 18:35:55.393374] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:32:12.126 [2024-07-12 18:35:55.393473] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:12.126 [2024-07-12 18:35:55.393488] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:12.126 [2024-07-12 18:35:55.401391] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:32:12.126 [2024-07-12 18:35:55.401410] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:12.126 [2024-07-12 18:35:55.401422] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:12.126 [2024-07-12 18:35:55.409411] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:32:12.126 [2024-07-12 18:35:55.409428] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:32:12.126 [2024-07-12 18:35:55.409440] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:12.126 [2024-07-12 18:35:55.417432] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:32:12.126 [2024-07-12 18:35:55.417449] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:32:12.126 [2024-07-12 18:35:55.417460] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:12.126 Running I/O for 5 seconds...
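Before I/O starts, the log above shows each DEK ("Found key …") paired with the base bdev the crypto vbdev is waiting on ("Currently unable to find bdev with name: …"). A minimal sketch of pulling those pairs out of such logs; `pair_keys_with_bdevs` is a hypothetical helper, not part of SPDK, and assumes the two notices appear in that order as they do here:

```python
import re

# Hypothetical log-scraping helper (assumption: "Found key" notice is
# immediately followed by the matching bdev_open_ext notice, as in the log).
KEY_RE = re.compile(r'rpc_bdev_crypto_create: \*NOTICE\*: Found key "([^"]+)"')
BDEV_RE = re.compile(
    r"bdev_open_ext: \*NOTICE\*: Currently unable to find bdev with name: (\S+)"
)

def pair_keys_with_bdevs(log_lines):
    """Return (key, base_bdev) pairs in the order they appear in the log."""
    pairs, pending_key = [], None
    for line in log_lines:
        m = KEY_RE.search(line)
        if m:
            pending_key = m.group(1)
            continue
        m = BDEV_RE.search(line)
        if m and pending_key is not None:
            pairs.append((pending_key, m.group(1)))
            pending_key = None
    return pairs

sample = [
    '[2024-07-12 18:35:55.393374] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"',
    '[2024-07-12 18:35:55.393473] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0',
]
print(pair_keys_with_bdevs(sample))  # [('test_dek_aesni_cbc_1', 'Malloc0')]
```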
00:32:17.390
00:32:17.390 Latency(us)
00:32:17.390 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:17.390 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:17.390 Verification LBA range: start 0x0 length 0x1000
00:32:17.390 crypto_ram : 5.06 505.48 1.97 0.00 0.00 252732.42 6040.71 174154.80
00:32:17.390 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:17.390 Verification LBA range: start 0x1000 length 0x1000
00:32:17.390 crypto_ram : 5.06 505.80 1.98 0.00 0.00 252548.10 10086.85 174154.80
00:32:17.390 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:17.390 Verification LBA range: start 0x0 length 0x1000
00:32:17.390 crypto_ram2 : 5.07 505.37 1.97 0.00 0.00 252068.55 6696.07 158654.11
00:32:17.390 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:17.390 Verification LBA range: start 0x1000 length 0x1000
00:32:17.390 crypto_ram2 : 5.06 505.70 1.98 0.00 0.00 251861.78 10884.67 158654.11
00:32:17.390 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:17.390 Verification LBA range: start 0x0 length 0x1000
00:32:17.390 crypto_ram3 : 5.05 3929.21 15.35 0.00 0.00 32258.98 9175.04 26898.25
00:32:17.390 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:17.390 Verification LBA range: start 0x1000 length 0x1000
00:32:17.390 crypto_ram3 : 5.05 3952.76 15.44 0.00 0.00 32088.97 6781.55 26898.25
00:32:17.390 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:17.390 Verification LBA range: start 0x0 length 0x1000
00:32:17.390 crypto_ram4 : 5.06 3945.56 15.41 0.00 0.00 32059.63 2678.43 24846.69
00:32:17.390 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:17.390 Verification LBA range: start 0x1000 length 0x1000
00:32:17.390 crypto_ram4 : 5.06 3960.66 15.47 0.00 0.00 31930.15 815.64 25416.57
00:32:17.390 ===================================================================================================================
00:32:17.390 Total : 17810.53 69.57 0.00 0.00 57130.57 815.64 174154.80
00:32:17.390
00:32:17.390 real 0m8.288s
00:32:17.390 user 0m15.689s
00:32:17.390 sys 0m0.388s
00:32:17.390 18:36:01 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:17.390 18:36:01 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:32:17.390 ************************************
00:32:17.390 END TEST bdev_verify
00:32:17.390 ************************************
00:32:17.390 18:36:01 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:17.390 18:36:01 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:17.390 18:36:01 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:32:17.390 18:36:01 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:17.390 18:36:01 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:17.390 ************************************
00:32:17.390 START TEST bdev_verify_big_io
00:32:17.390 ************************************
00:32:17.390 18:36:01 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:17.646 [2024-07-12 18:36:01.153379] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
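The "Device Information" rows above follow one fixed column order (runtime(s), IOPS, MiB/s, Fail/s, TO/s, Average, min, max latency in usec). A minimal sketch of turning one such row into named fields; `parse_device_row` is a hypothetical helper written for this report format, not an SPDK or fio API:

```python
# Hypothetical parser (assumption: the column order matches the bdevperf
# "Device Information" header: runtime(s) IOPS MiB/s Fail/s TO/s Average min max).
def parse_device_row(row: str) -> dict:
    name, _, rest = row.partition(":")
    runtime, iops, mibps, fails, timeouts, avg, lat_min, lat_max = map(float, rest.split())
    return {
        "name": name.strip(), "runtime_s": runtime, "iops": iops,
        "mib_per_s": mibps, "fail_per_s": fails, "to_per_s": timeouts,
        "avg_us": avg, "min_us": lat_min, "max_us": lat_max,
    }

row = "crypto_ram3 : 5.05 3929.21 15.35 0.00 0.00 32258.98 9175.04 26898.25"
stats = parse_device_row(row)
print(stats["iops"], stats["avg_us"])  # 3929.21 32258.98
```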
00:32:17.646 [2024-07-12 18:36:01.153442] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2647287 ] 00:32:17.646 [2024-07-12 18:36:01.282536] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:17.902 [2024-07-12 18:36:01.386666] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:17.902 [2024-07-12 18:36:01.386671] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:17.902 [2024-07-12 18:36:01.408031] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:17.902 [2024-07-12 18:36:01.416061] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:17.902 [2024-07-12 18:36:01.424091] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:17.902 [2024-07-12 18:36:01.526624] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:20.425 [2024-07-12 18:36:03.747863] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:20.425 [2024-07-12 18:36:03.747955] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:20.425 [2024-07-12 18:36:03.747971] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:20.425 [2024-07-12 18:36:03.755883] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:20.425 [2024-07-12 18:36:03.755903] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:20.425 [2024-07-12 18:36:03.755915] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:32:20.425 [2024-07-12 18:36:03.763905] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:20.425 [2024-07-12 18:36:03.763924] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:20.425 [2024-07-12 18:36:03.763945] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:20.425 [2024-07-12 18:36:03.771932] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:20.425 [2024-07-12 18:36:03.771952] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:20.425 [2024-07-12 18:36:03.771963] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:20.425 Running I/O for 5 seconds... 00:32:23.220 [2024-07-12 18:36:06.847386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:23.220 [2024-07-12 18:36:06.847503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:23.220 [2024-07-12 18:36:06.847892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:23.220 [2024-07-12 18:36:06.847994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:23.220 [2024-07-12 18:36:06.861026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:23.221 [2024-07-12 18:36:06.861086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:23.221 [2024-07-12 18:36:06.861131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:23.221 [2024-07-12 18:36:06.861173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:23.479 [2024-07-12 18:36:07.023587] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.479 [2024-07-12 18:36:07.023688] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.479 [2024-07-12 18:36:07.025060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.249514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.249579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.249634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.250032] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.251495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.251558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.251606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.251653] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.252233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.739 [2024-07-12 18:36:07.252297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.252344] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.252392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.253747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.253815] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.253864] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.253911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.254373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.254430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.254477] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.254524] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.255875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.255942] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.739 [2024-07-12 18:36:07.255989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.256062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.256642] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.256708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.256770] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.256858] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.258498] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.258558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.258605] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.258653] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.739 [2024-07-12 18:36:07.259091] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.259150] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.259200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.740 [2024-07-12 18:36:07.259247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.260425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.260484] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.260538] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.260588] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.261110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.261167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.261215] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.261261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.262552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.262610] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.262662] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.262708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.740 [2024-07-12 18:36:07.263186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.263248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.263297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.263344] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.264796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.264858] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.264903] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.264965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.265406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.265461] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.265508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.265570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.267055] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.740 [2024-07-12 18:36:07.267135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.267196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.267245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.267705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.267759] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.267807] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.267854] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.269210] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.269740] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.269793] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.271069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.272214] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.272280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.740 [2024-07-12 18:36:07.273430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.273486] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.274794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.276523] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.276586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.278264] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.280338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.280406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.282205] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.282257] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.283467] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.284198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.284253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.740 [2024-07-12 18:36:07.285712] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.287042] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.287107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.288508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.288562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.290187] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.290774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.290827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.292107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.292892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.292961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.294510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.294565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.740 [2024-07-12 18:36:07.295899] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.297542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.297602] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.299188] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.301240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.301306] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.303142] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.303202] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.304425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.305126] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.305180] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.306695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.307990] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.740 [2024-07-12 18:36:07.308056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.309510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.309564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.310792] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.311678] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.311733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.312577] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.740 [2024-07-12 18:36:07.314200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.741 [2024-07-12 18:36:07.314263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.741 [2024-07-12 18:36:07.315274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.741 [2024-07-12 18:36:07.315329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.741 [2024-07-12 18:36:07.316485] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:23.741 [2024-07-12 18:36:07.318326] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:32:23.741 [2024-07-12 18:36:07.318391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:24.309 [2024-07-12 18:36:07.767477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:26.381
00:32:26.381 Latency(us)
00:32:26.381 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:26.381 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:26.381 Verification LBA range: start 0x0 length 0x100
00:32:26.381 crypto_ram : 5.89 43.47 2.72 0.00 0.00 2858361.54 72944.42 2771887.86
00:32:26.382 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:26.382 Verification LBA range: start 0x100 length 0x100
00:32:26.382 crypto_ram : 5.88 43.52 2.72 0.00 0.00 2854753.50 76591.64 2713532.33
00:32:26.382 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:26.382 Verification LBA range: start 0x0 length 0x100
00:32:26.382 crypto_ram2 : 5.89 43.46 2.72 0.00 0.00 2755448.65 72488.51 2771887.86
00:32:26.382 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:26.382 Verification LBA range: start 0x100 length 0x100
00:32:26.382 crypto_ram2 : 5.88 43.51 2.72 0.00 0.00 2753430.93 75679.83 2713532.33
00:32:26.382 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:26.382 Verification LBA range: start 0x0 length 0x100
00:32:26.382 crypto_ram3 : 5.60 262.13 16.38 0.00 0.00 433995.10 14303.94 725796.95
00:32:26.382 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:26.382 Verification LBA range: start 0x100 length 0x100
00:32:26.382 crypto_ram3 : 5.62 266.38 16.65 0.00 0.00 427993.03 69297.20 634616.43
00:32:26.382 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:26.382 Verification LBA range: start 0x0 length 0x100
00:32:26.382 crypto_ram4 : 5.71 278.84 17.43 0.00 0.00 396320.68 2706.92 550730.35
00:32:26.382 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:26.382 Verification LBA range: start 0x100 length 0x100
00:32:26.382 crypto_ram4 : 5.71 281.85 17.62 0.00 0.00 392889.60 11739.49 547083.13
00:32:26.382 ===================================================================================================================
00:32:26.382 Total : 1263.16 78.95 0.00 0.00 752964.63 2706.92 2771887.86
00:32:26.640
00:32:26.640 real 0m9.141s
00:32:26.640 user 0m17.381s
00:32:26.640 sys 0m0.395s
00:32:26.640 18:36:10 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:26.640 18:36:10 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:32:26.640 ************************************
00:32:26.640 END TEST bdev_verify_big_io
00:32:26.640 ************************************
00:32:26.640 18:36:10 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:26.640 18:36:10 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:26.640 18:36:10 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:32:26.640 18:36:10 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:26.640 18:36:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:26.640 ************************************
00:32:26.640 START TEST bdev_write_zeroes
00:32:26.640 ************************************
00:32:26.640 18:36:10 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:26.640 [2024-07-12 18:36:10.362806] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:32:26.640 [2024-07-12 18:36:10.362865] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2648953 ]
00:32:26.898 [2024-07-12 18:36:10.490471] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:32:26.898 [2024-07-12 18:36:10.587187] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:26.898 [2024-07-12 18:36:10.608516] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:32:26.898 [2024-07-12 18:36:10.616543] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:32:27.156 [2024-07-12 18:36:10.624572] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:32:29.541 [2024-07-12 18:36:11.475030] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:32:33.829 [2024-07-12 18:36:17.303838] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:32:33.829 [2024-07-12 18:36:17.303905] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:33.829 [2024-07-12 18:36:17.303920] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:33.829 [2024-07-12 18:36:17.311860] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:32:33.829 [2024-07-12 18:36:17.311881] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:33.829 [2024-07-12 18:36:17.311893] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:33.829 [2024-07-12 18:36:17.319881] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:32:33.829 [2024-07-12 18:36:17.319900] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:32:33.829 [2024-07-12 18:36:17.319911] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:33.829 [2024-07-12 18:36:17.327901] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:32:33.829 [2024-07-12 18:36:17.327919] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:32:33.829 [2024-07-12 18:36:17.327939] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:33.829 Running I/O for 1 seconds...
00:32:34.767
00:32:34.767 Latency(us)
00:32:34.767 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:34.767 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:34.767 crypto_ram : 1.03 1971.91 7.70 0.00 0.00 64410.86 5413.84 77503.44
00:32:34.767 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:34.767 crypto_ram2 : 1.03 1977.63 7.73 0.00 0.00 63882.03 5385.35 72032.61
00:32:34.767 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:34.767 crypto_ram3 : 1.02 15168.41 59.25 0.00 0.00 8315.83 2464.72 10770.70
00:32:34.767 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:34.767 crypto_ram4 : 1.02 15205.48 59.40 0.00 0.00 8269.87 2450.48 8662.15
00:32:34.767 ===================================================================================================================
00:32:34.767 Total : 34323.43 134.08 0.00 0.00 14747.44 2450.48 77503.44
00:32:35.336
00:32:35.336 real 0m8.577s
00:32:35.336 user 0m3.824s
00:32:35.336 sys 0m0.363s
00:32:35.336 18:36:18 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:35.336 18:36:18 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:32:35.336 ************************************
00:32:35.336 END TEST bdev_write_zeroes
00:32:35.336 ************************************
00:32:35.336 18:36:18 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:35.336 18:36:18 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:35.336 18:36:18 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:32:35.336 18:36:18 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:35.336 18:36:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:35.336 ************************************
00:32:35.336 START TEST bdev_json_nonenclosed
00:32:35.336 ************************************
00:32:35.336 18:36:18 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:35.336 [2024-07-12 18:36:19.023588] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:32:35.336 [2024-07-12 18:36:19.023649] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2650001 ]
00:32:35.596 [2024-07-12 18:36:19.152645] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:32:35.596 [2024-07-12 18:36:19.253164] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:35.596 [2024-07-12 18:36:19.253236] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:32:35.596 [2024-07-12 18:36:19.253257] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:32:35.596 [2024-07-12 18:36:19.253270] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:32:35.855
00:32:35.855 real 0m0.395s
00:32:35.855 user 0m0.237s
00:32:35.855 sys 0m0.155s
00:32:35.855 18:36:19 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:32:35.855 18:36:19 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:35.855 18:36:19 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:32:35.855 ************************************
00:32:35.855 END TEST bdev_json_nonenclosed
00:32:35.855 ************************************
00:32:35.855 18:36:19 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234
00:32:35.855 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true
00:32:35.855 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:35.855 18:36:19 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:32:35.855 18:36:19 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:35.855 18:36:19 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:35.855 ************************************
00:32:35.855 START TEST bdev_json_nonarray
00:32:35.855 ************************************
00:32:35.855 18:36:19 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:35.855 [2024-07-12 18:36:19.498372] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:32:35.855 [2024-07-12 18:36:19.498433] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2650035 ]
00:32:36.115 [2024-07-12 18:36:19.626369] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:32:36.115 [2024-07-12 18:36:19.722759] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:36.115 [2024-07-12 18:36:19.722832] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:32:36.115 [2024-07-12 18:36:19.722853] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:32:36.115 [2024-07-12 18:36:19.722866] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:32:36.115
00:32:36.115 real 0m0.386s
00:32:36.115 user 0m0.226s
00:32:36.115 sys 0m0.158s
00:32:36.115 18:36:19 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:32:36.115 18:36:19 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:36.115 18:36:19 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:32:36.115 ************************************
00:32:36.115 END TEST bdev_json_nonarray
00:32:36.115 ************************************
00:32:36.375 18:36:19 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]]
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]]
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]]
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]]
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]]
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]]
00:32:36.375 18:36:19 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]]
00:32:36.375
00:32:36.375 real 1m17.234s
00:32:36.375 user 2m40.267s
00:32:36.375 sys 0m9.087s
00:32:36.375 18:36:19 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:36.375 18:36:19 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:36.375 ************************************
00:32:36.375 END TEST blockdev_crypto_aesni
00:32:36.375 ************************************
00:32:36.375 18:36:19 -- common/autotest_common.sh@1142 -- # return 0
00:32:36.375 18:36:19 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:32:36.375 18:36:19 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:32:36.375 18:36:19 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:36.375 18:36:19 -- common/autotest_common.sh@10 -- # set +x
00:32:36.375 ************************************
00:32:36.375 START TEST blockdev_crypto_sw
00:32:36.375 ************************************
00:32:36.375 18:36:19 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:32:36.375 * Looking for test storage...
00:32:36.375 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:32:36.375 
18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2650170 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:36.375 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 2650170 00:32:36.375 18:36:20 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 2650170 ']' 00:32:36.375 18:36:20 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:36.375 18:36:20 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:36.375 18:36:20 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:36.376 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:36.376 18:36:20 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:36.376 18:36:20 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:36.376 18:36:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:36.634 [2024-07-12 18:36:20.122878] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:32:36.634 [2024-07-12 18:36:20.122960] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2650170 ] 00:32:36.634 [2024-07-12 18:36:20.241619] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:36.634 [2024-07-12 18:36:20.339793] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:38.011 Malloc0 00:32:38.011 Malloc1 00:32:38.011 true 00:32:38.011 true 00:32:38.011 true 00:32:38.011 [2024-07-12 18:36:21.600361] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:38.011 crypto_ram 00:32:38.011 [2024-07-12 18:36:21.608387] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:38.011 crypto_ram2 00:32:38.011 [2024-07-12 18:36:21.616414] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:38.011 crypto_ram3 00:32:38.011 [ 00:32:38.011 { 00:32:38.011 "name": "Malloc1", 00:32:38.011 "aliases": [ 00:32:38.011 "df7ee121-af42-44de-9742-65405ae2b61c" 00:32:38.011 ], 00:32:38.011 "product_name": "Malloc disk", 00:32:38.011 "block_size": 4096, 00:32:38.011 "num_blocks": 4096, 00:32:38.011 "uuid": "df7ee121-af42-44de-9742-65405ae2b61c", 
00:32:38.011 "assigned_rate_limits": { 00:32:38.011 "rw_ios_per_sec": 0, 00:32:38.011 "rw_mbytes_per_sec": 0, 00:32:38.011 "r_mbytes_per_sec": 0, 00:32:38.011 "w_mbytes_per_sec": 0 00:32:38.011 }, 00:32:38.011 "claimed": true, 00:32:38.011 "claim_type": "exclusive_write", 00:32:38.011 "zoned": false, 00:32:38.011 "supported_io_types": { 00:32:38.011 "read": true, 00:32:38.011 "write": true, 00:32:38.011 "unmap": true, 00:32:38.011 "flush": true, 00:32:38.011 "reset": true, 00:32:38.011 "nvme_admin": false, 00:32:38.011 "nvme_io": false, 00:32:38.011 "nvme_io_md": false, 00:32:38.011 "write_zeroes": true, 00:32:38.011 "zcopy": true, 00:32:38.011 "get_zone_info": false, 00:32:38.011 "zone_management": false, 00:32:38.011 "zone_append": false, 00:32:38.011 "compare": false, 00:32:38.011 "compare_and_write": false, 00:32:38.011 "abort": true, 00:32:38.011 "seek_hole": false, 00:32:38.011 "seek_data": false, 00:32:38.011 "copy": true, 00:32:38.011 "nvme_iov_md": false 00:32:38.011 }, 00:32:38.011 "memory_domains": [ 00:32:38.011 { 00:32:38.011 "dma_device_id": "system", 00:32:38.011 "dma_device_type": 1 00:32:38.011 }, 00:32:38.011 { 00:32:38.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:38.011 "dma_device_type": 2 00:32:38.011 } 00:32:38.011 ], 00:32:38.011 "driver_specific": {} 00:32:38.011 } 00:32:38.011 ] 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:32:38.011 18:36:21 
blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:32:38.011 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:38.011 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:38.270 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:38.270 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:32:38.271 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "51c992f1-be0b-50e0-84a4-ea8682f461db"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "51c992f1-be0b-50e0-84a4-ea8682f461db",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "09fdcc7c-f343-5912-ba1b-99059684948f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "09fdcc7c-f343-5912-ba1b-99059684948f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:38.271 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:32:38.271 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:32:38.271 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:32:38.271 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:32:38.271 18:36:21 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 2650170 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 2650170 ']' 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 2650170 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2650170 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2650170' 00:32:38.271 killing process with pid 2650170 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 2650170 00:32:38.271 18:36:21 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 2650170 00:32:38.530 18:36:22 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:38.530 
18:36:22 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:38.530 18:36:22 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:32:38.530 18:36:22 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:38.530 18:36:22 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:38.789 ************************************ 00:32:38.789 START TEST bdev_hello_world 00:32:38.789 ************************************ 00:32:38.789 18:36:22 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:38.789 [2024-07-12 18:36:22.338787] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:32:38.789 [2024-07-12 18:36:22.338849] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2650468 ] 00:32:38.789 [2024-07-12 18:36:22.465202] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:39.048 [2024-07-12 18:36:22.561875] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:39.048 [2024-07-12 18:36:22.732775] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:39.048 [2024-07-12 18:36:22.732838] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:39.048 [2024-07-12 18:36:22.732854] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:39.048 [2024-07-12 18:36:22.740793] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:39.048 [2024-07-12 18:36:22.740814] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:39.048 [2024-07-12 18:36:22.740826] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:39.048 [2024-07-12 18:36:22.748814] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:39.048 [2024-07-12 18:36:22.748833] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:39.048 [2024-07-12 18:36:22.748845] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:39.306 [2024-07-12 18:36:22.789302] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:39.306 [2024-07-12 18:36:22.789337] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:39.306 [2024-07-12 18:36:22.789355] hello_bdev.c: 
244:hello_start: *NOTICE*: Opening io channel 00:32:39.306 [2024-07-12 18:36:22.791375] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:39.306 [2024-07-12 18:36:22.791442] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:39.306 [2024-07-12 18:36:22.791458] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:39.306 [2024-07-12 18:36:22.791492] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:32:39.306 00:32:39.306 [2024-07-12 18:36:22.791510] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:39.306 00:32:39.306 real 0m0.727s 00:32:39.306 user 0m0.472s 00:32:39.306 sys 0m0.237s 00:32:39.306 18:36:23 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:39.306 18:36:23 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:39.306 ************************************ 00:32:39.306 END TEST bdev_hello_world 00:32:39.306 ************************************ 00:32:39.563 18:36:23 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:39.563 18:36:23 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:39.563 18:36:23 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:39.563 18:36:23 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:39.563 18:36:23 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:39.821 ************************************ 00:32:39.821 START TEST bdev_bounds 00:32:39.821 ************************************ 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2650659 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT 
SIGTERM EXIT 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2650659' 00:32:39.821 Process bdevio pid: 2650659 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2650659 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2650659 ']' 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:39.821 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:39.821 18:36:23 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:39.821 [2024-07-12 18:36:23.449176] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:32:39.821 [2024-07-12 18:36:23.449240] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2650659 ] 00:32:40.079 [2024-07-12 18:36:23.576041] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:40.079 [2024-07-12 18:36:23.675685] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:40.079 [2024-07-12 18:36:23.675768] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:40.079 [2024-07-12 18:36:23.675773] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:40.336 [2024-07-12 18:36:23.854199] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:40.336 [2024-07-12 18:36:23.854267] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:40.336 [2024-07-12 18:36:23.854282] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:40.336 [2024-07-12 18:36:23.862221] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:40.336 [2024-07-12 18:36:23.862244] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:40.336 [2024-07-12 18:36:23.862255] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:40.336 [2024-07-12 18:36:23.870245] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:40.336 [2024-07-12 18:36:23.870264] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:40.336 [2024-07-12 18:36:23.870276] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 
-- # (( i == 0 )) 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:40.904 I/O targets: 00:32:40.904 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:32:40.904 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:32:40.904 00:32:40.904 00:32:40.904 CUnit - A unit testing framework for C - Version 2.1-3 00:32:40.904 http://cunit.sourceforge.net/ 00:32:40.904 00:32:40.904 00:32:40.904 Suite: bdevio tests on: crypto_ram3 00:32:40.904 Test: blockdev write read block ...passed 00:32:40.904 Test: blockdev write zeroes read block ...passed 00:32:40.904 Test: blockdev write zeroes read no split ...passed 00:32:40.904 Test: blockdev write zeroes read split ...passed 00:32:40.904 Test: blockdev write zeroes read split partial ...passed 00:32:40.904 Test: blockdev reset ...passed 00:32:40.904 Test: blockdev write read 8 blocks ...passed 00:32:40.904 Test: blockdev write read size > 128k ...passed 00:32:40.904 Test: blockdev write read invalid size ...passed 00:32:40.904 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:40.904 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:40.904 Test: blockdev write read max offset ...passed 00:32:40.904 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:40.904 Test: blockdev writev readv 8 blocks ...passed 00:32:40.904 Test: blockdev writev readv 30 x 1block ...passed 00:32:40.904 Test: blockdev writev readv block ...passed 00:32:40.904 Test: blockdev writev readv size > 128k ...passed 00:32:40.904 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:40.904 Test: blockdev comparev and writev ...passed 00:32:40.904 Test: blockdev nvme passthru rw ...passed 00:32:40.904 Test: blockdev nvme passthru vendor specific 
...passed 00:32:40.904 Test: blockdev nvme admin passthru ...passed 00:32:40.904 Test: blockdev copy ...passed 00:32:40.904 Suite: bdevio tests on: crypto_ram 00:32:40.904 Test: blockdev write read block ...passed 00:32:40.904 Test: blockdev write zeroes read block ...passed 00:32:40.904 Test: blockdev write zeroes read no split ...passed 00:32:40.904 Test: blockdev write zeroes read split ...passed 00:32:40.904 Test: blockdev write zeroes read split partial ...passed 00:32:40.904 Test: blockdev reset ...passed 00:32:40.904 Test: blockdev write read 8 blocks ...passed 00:32:40.904 Test: blockdev write read size > 128k ...passed 00:32:40.904 Test: blockdev write read invalid size ...passed 00:32:40.904 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:40.904 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:40.904 Test: blockdev write read max offset ...passed 00:32:40.904 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:40.904 Test: blockdev writev readv 8 blocks ...passed 00:32:40.904 Test: blockdev writev readv 30 x 1block ...passed 00:32:40.904 Test: blockdev writev readv block ...passed 00:32:40.904 Test: blockdev writev readv size > 128k ...passed 00:32:40.904 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:40.904 Test: blockdev comparev and writev ...passed 00:32:40.904 Test: blockdev nvme passthru rw ...passed 00:32:40.904 Test: blockdev nvme passthru vendor specific ...passed 00:32:40.904 Test: blockdev nvme admin passthru ...passed 00:32:40.904 Test: blockdev copy ...passed 00:32:40.904 00:32:40.904 Run Summary: Type Total Ran Passed Failed Inactive 00:32:40.904 suites 2 2 n/a 0 0 00:32:40.904 tests 46 46 46 0 0 00:32:40.904 asserts 260 260 260 0 n/a 00:32:40.904 00:32:40.904 Elapsed time = 0.084 seconds 00:32:40.904 0 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2650659 00:32:40.904 18:36:24 
blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2650659 ']' 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2650659 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2650659 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2650659' 00:32:40.904 killing process with pid 2650659 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2650659 00:32:40.904 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2650659 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:41.162 00:32:41.162 real 0m1.412s 00:32:41.162 user 0m3.646s 00:32:41.162 sys 0m0.382s 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:41.162 ************************************ 00:32:41.162 END TEST bdev_bounds 00:32:41.162 ************************************ 00:32:41.162 18:36:24 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:41.162 18:36:24 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:32:41.162 18:36:24 
blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:41.162 18:36:24 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:41.162 18:36:24 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:41.162 ************************************ 00:32:41.162 START TEST bdev_nbd 00:32:41.162 ************************************ 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2650865 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2650865 /var/tmp/spdk-nbd.sock 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2650865 ']' 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:32:41.162 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:41.162 18:36:24 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:41.420 [2024-07-12 18:36:24.943381] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:32:41.420 [2024-07-12 18:36:24.943445] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:41.420 [2024-07-12 18:36:25.070040] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:41.678 [2024-07-12 18:36:25.174485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:41.678 [2024-07-12 18:36:25.349658] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:41.678 [2024-07-12 18:36:25.349728] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:41.678 [2024-07-12 18:36:25.349742] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:41.678 [2024-07-12 18:36:25.357677] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:41.678 [2024-07-12 18:36:25.357697] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:41.678 [2024-07-12 18:36:25.357709] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:41.678 [2024-07-12 18:36:25.365699] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:41.678 [2024-07-12 18:36:25.365717] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:41.678 [2024-07-12 18:36:25.365734] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:42.245 18:36:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:42.502 1+0 records in 00:32:42.502 1+0 records out 00:32:42.502 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00017577 s, 23.3 MB/s 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:42.502 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd1 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:42.760 1+0 records in 00:32:42.760 1+0 records out 00:32:42.760 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032007 s, 12.8 MB/s 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 
00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:32:42.760 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:43.017 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:43.017 { 00:32:43.017 "nbd_device": "/dev/nbd0", 00:32:43.017 "bdev_name": "crypto_ram" 00:32:43.017 }, 00:32:43.017 { 00:32:43.017 "nbd_device": "/dev/nbd1", 00:32:43.017 "bdev_name": "crypto_ram3" 00:32:43.017 } 00:32:43.017 ]' 00:32:43.017 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:43.017 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:43.017 { 00:32:43.017 "nbd_device": "/dev/nbd0", 00:32:43.017 "bdev_name": "crypto_ram" 00:32:43.017 }, 00:32:43.017 { 00:32:43.017 "nbd_device": "/dev/nbd1", 00:32:43.017 "bdev_name": "crypto_ram3" 00:32:43.017 } 00:32:43.017 ]' 00:32:43.017 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:43.285 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:43.286 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:43.286 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:43.286 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:43.286 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:43.286 18:36:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:43.286 18:36:26 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:43.543 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:43.544 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:43.544 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:43.544 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:44.108 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:44.366 
18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:44.366 18:36:27 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:44.366 18:36:27 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:44.624 /dev/nbd0 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:44.624 1+0 records in 00:32:44.624 1+0 records out 00:32:44.624 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224777 s, 18.2 MB/s 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- 
# rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:44.624 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:32:45.190 /dev/nbd1 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:45.190 1+0 records in 00:32:45.190 1+0 records out 00:32:45.190 4096 bytes (4.1 kB, 
4.0 KiB) copied, 0.000314429 s, 13.0 MB/s 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:45.190 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:45.448 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:45.448 { 00:32:45.448 "nbd_device": "/dev/nbd0", 00:32:45.448 "bdev_name": "crypto_ram" 00:32:45.448 }, 00:32:45.448 { 00:32:45.448 "nbd_device": "/dev/nbd1", 00:32:45.448 "bdev_name": "crypto_ram3" 00:32:45.448 } 00:32:45.448 ]' 00:32:45.448 18:36:28 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:45.448 { 00:32:45.448 "nbd_device": "/dev/nbd0", 00:32:45.448 "bdev_name": "crypto_ram" 00:32:45.448 }, 00:32:45.448 { 00:32:45.448 "nbd_device": "/dev/nbd1", 00:32:45.448 "bdev_name": "crypto_ram3" 00:32:45.448 } 00:32:45.448 ]' 00:32:45.448 18:36:28 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:45.448 /dev/nbd1' 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:45.448 /dev/nbd1' 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:45.448 256+0 records in 00:32:45.448 256+0 records out 00:32:45.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108869 s, 96.3 MB/s 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:45.448 256+0 records in 00:32:45.448 256+0 records out 00:32:45.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0311361 s, 33.7 MB/s 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:45.448 256+0 records in 00:32:45.448 256+0 records out 00:32:45.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0462811 s, 22.7 MB/s 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:45.448 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:46.013 18:36:29 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:46.013 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:46.272 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:46.529 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:46.529 18:36:29 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:46.529 18:36:29 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:46.529 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:46.529 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:46.529 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:46.529 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:46.529 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:46.529 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:46.529 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:46.530 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:46.787 malloc_lvol_verify 00:32:46.787 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:47.045 b572c066-c820-4eeb-b4a3-8d2d8e46f9d4 00:32:47.045 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:47.303 5d2936c1-6471-45e4-8f80-70877ad4fbcb 00:32:47.303 18:36:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:49.386 /dev/nbd0 00:32:49.386 18:36:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:49.386 mke2fs 1.46.5 (30-Dec-2021) 00:32:49.386 Discarding device blocks: 0/4096 done 00:32:49.386 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:49.386 00:32:49.386 Allocating group tables: 0/1 done 00:32:49.386 Writing inode tables: 0/1 done 00:32:49.386 Creating journal (1024 blocks): done 00:32:49.386 Writing superblocks and filesystem accounting information: 0/1 done 00:32:49.386 00:32:49.386 18:36:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:49.386 18:36:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:49.386 18:36:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:49.386 18:36:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:49.386 18:36:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:49.386 18:36:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:49.386 
18:36:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:49.386 18:36:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2650865 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2650865 ']' 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2650865 00:32:49.678 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:49.679 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:49.679 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2650865 00:32:49.679 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:32:49.679 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:49.679 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2650865' 00:32:49.679 killing process with pid 2650865 00:32:49.679 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2650865 00:32:49.679 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2650865 00:32:49.936 18:36:33 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:49.937 00:32:49.937 real 0m8.714s 00:32:49.937 user 0m9.451s 00:32:49.937 sys 0m2.975s 00:32:49.937 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:49.937 18:36:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:49.937 ************************************ 00:32:49.937 END TEST bdev_nbd 00:32:49.937 ************************************ 00:32:49.937 18:36:33 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:49.937 18:36:33 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:49.937 18:36:33 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:32:49.937 18:36:33 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:32:49.937 18:36:33 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:49.937 18:36:33 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:49.937 18:36:33 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:49.937 18:36:33 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:50.195 ************************************ 00:32:50.195 START TEST bdev_fio 00:32:50.195 ************************************ 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 
00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:50.195 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:50.195 ************************************ 00:32:50.195 START TEST bdev_fio_rw_verify 00:32:50.195 ************************************ 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1339 -- # local sanitizers 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:32:50.195 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:50.453 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:50.453 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:50.453 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:50.453 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:50.453 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:50.453 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:50.453 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:50.453 
18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:50.453 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:50.453 18:36:33 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:51.827 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:51.827 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:51.827 fio-3.35 00:32:51.827 Starting 2 threads 00:33:04.024 00:33:04.024 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2652353: Fri Jul 12 18:36:45 2024 00:33:04.024 read: IOPS=22.0k, BW=86.1MiB/s (90.3MB/s)(861MiB/10000msec) 00:33:04.024 slat (usec): min=14, max=527, avg=19.69, stdev= 3.61 00:33:04.024 clat (usec): min=7, max=2209, avg=143.59, stdev=58.19 00:33:04.024 lat (usec): min=25, max=2227, avg=163.28, stdev=59.56 00:33:04.024 clat percentiles (usec): 00:33:04.024 | 50.000th=[ 141], 99.000th=[ 277], 99.900th=[ 297], 99.990th=[ 338], 00:33:04.024 | 99.999th=[ 2180] 00:33:04.024 write: IOPS=26.5k, BW=103MiB/s (108MB/s)(980MiB/9489msec); 0 zone resets 00:33:04.024 slat (usec): min=14, max=257, avg=33.46, stdev= 4.22 00:33:04.024 clat (usec): min=24, max=2105, avg=194.30, stdev=89.08 00:33:04.024 lat (usec): min=51, max=2136, avg=227.77, stdev=90.59 00:33:04.024 clat percentiles (usec): 00:33:04.024 | 50.000th=[ 188], 99.000th=[ 383], 99.900th=[ 404], 99.990th=[ 586], 
00:33:04.024 | 99.999th=[ 2040] 00:33:04.024 bw ( KiB/s): min=94024, max=107576, per=95.09%, avg=100611.37, stdev=2156.03, samples=38 00:33:04.024 iops : min=23506, max=26894, avg=25152.84, stdev=539.01, samples=38 00:33:04.024 lat (usec) : 10=0.01%, 20=0.01%, 50=4.96%, 100=14.96%, 250=63.44% 00:33:04.024 lat (usec) : 500=16.61%, 750=0.01%, 1000=0.01% 00:33:04.024 lat (msec) : 2=0.01%, 4=0.01% 00:33:04.024 cpu : usr=99.56%, sys=0.00%, ctx=32, majf=0, minf=387 00:33:04.024 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:04.024 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:04.024 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:04.024 issued rwts: total=220456,251000,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:04.024 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:04.024 00:33:04.024 Run status group 0 (all jobs): 00:33:04.024 READ: bw=86.1MiB/s (90.3MB/s), 86.1MiB/s-86.1MiB/s (90.3MB/s-90.3MB/s), io=861MiB (903MB), run=10000-10000msec 00:33:04.024 WRITE: bw=103MiB/s (108MB/s), 103MiB/s-103MiB/s (108MB/s-108MB/s), io=980MiB (1028MB), run=9489-9489msec 00:33:04.024 00:33:04.024 real 0m12.089s 00:33:04.024 user 0m23.591s 00:33:04.024 sys 0m0.353s 00:33:04.024 18:36:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:04.024 18:36:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:04.024 ************************************ 00:33:04.024 END TEST bdev_fio_rw_verify 00:33:04.024 ************************************ 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:04.024 
18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:04.024 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' 
"aliases": [' ' "51c992f1-be0b-50e0-84a4-ea8682f461db"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "51c992f1-be0b-50e0-84a4-ea8682f461db",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "09fdcc7c-f343-5912-ba1b-99059684948f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "09fdcc7c-f343-5912-ba1b-99059684948f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:04.025 crypto_ram3 ]] 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "51c992f1-be0b-50e0-84a4-ea8682f461db"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "51c992f1-be0b-50e0-84a4-ea8682f461db",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "09fdcc7c-f343-5912-ba1b-99059684948f"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "09fdcc7c-f343-5912-ba1b-99059684948f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:04.025 18:36:46 
blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:04.025 ************************************ 00:33:04.025 START TEST bdev_fio_trim 00:33:04.025 ************************************ 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 
-- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:04.025 18:36:46 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:04.025 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:04.025 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:04.025 fio-3.35 00:33:04.025 Starting 2 threads 00:33:14.009 00:33:14.009 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2653909: Fri Jul 12 18:36:57 2024 00:33:14.009 write: IOPS=42.3k, BW=165MiB/s (173MB/s)(1653MiB/10001msec); 0 zone resets 00:33:14.009 slat (usec): min=14, max=1937, avg=21.02, stdev= 5.09 00:33:14.009 clat (usec): min=34, max=2203, avg=152.69, stdev=86.48 00:33:14.009 lat (usec): min=51, max=2268, avg=173.71, stdev=89.68 00:33:14.009 clat percentiles (usec): 00:33:14.009 | 50.000th=[ 124], 99.000th=[ 302], 99.900th=[ 330], 
99.990th=[ 482], 00:33:14.009 | 99.999th=[ 775] 00:33:14.009 bw ( KiB/s): min=164376, max=171040, per=100.00%, avg=169388.63, stdev=723.52, samples=38 00:33:14.009 iops : min=41094, max=42760, avg=42347.16, stdev=180.88, samples=38 00:33:14.009 trim: IOPS=42.3k, BW=165MiB/s (173MB/s)(1653MiB/10001msec); 0 zone resets 00:33:14.009 slat (usec): min=6, max=102, avg=10.13, stdev= 2.43 00:33:14.009 clat (usec): min=50, max=673, avg=101.03, stdev=29.64 00:33:14.009 lat (usec): min=58, max=686, avg=111.16, stdev=29.92 00:33:14.009 clat percentiles (usec): 00:33:14.009 | 50.000th=[ 96], 99.000th=[ 153], 99.900th=[ 169], 99.990th=[ 273], 00:33:14.009 | 99.999th=[ 461] 00:33:14.009 bw ( KiB/s): min=164408, max=171040, per=100.00%, avg=169389.89, stdev=721.02, samples=38 00:33:14.009 iops : min=41102, max=42760, avg=42347.47, stdev=180.26, samples=38 00:33:14.009 lat (usec) : 50=7.17%, 100=38.64%, 250=43.92%, 500=10.27%, 750=0.01% 00:33:14.009 lat (usec) : 1000=0.01% 00:33:14.009 lat (msec) : 4=0.01% 00:33:14.009 cpu : usr=99.58%, sys=0.00%, ctx=22, majf=0, minf=202 00:33:14.009 IO depths : 1=7.5%, 2=17.5%, 4=60.0%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:14.009 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:14.009 complete : 0=0.0%, 4=87.0%, 8=13.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:14.009 issued rwts: total=0,423293,423293,0 short=0,0,0,0 dropped=0,0,0,0 00:33:14.009 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:14.009 00:33:14.009 Run status group 0 (all jobs): 00:33:14.009 WRITE: bw=165MiB/s (173MB/s), 165MiB/s-165MiB/s (173MB/s-173MB/s), io=1653MiB (1734MB), run=10001-10001msec 00:33:14.009 TRIM: bw=165MiB/s (173MB/s), 165MiB/s-165MiB/s (173MB/s-173MB/s), io=1653MiB (1734MB), run=10001-10001msec 00:33:14.009 00:33:14.009 real 0m11.195s 00:33:14.009 user 0m23.494s 00:33:14.009 sys 0m0.351s 00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:14.009 ************************************ 00:33:14.009 END TEST bdev_fio_trim 00:33:14.009 ************************************ 00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:33:14.009 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:33:14.009 00:33:14.009 real 0m23.633s 00:33:14.009 user 0m47.247s 00:33:14.009 sys 0m0.910s 00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:14.009 ************************************ 00:33:14.009 END TEST bdev_fio 00:33:14.009 ************************************ 00:33:14.009 18:36:57 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:14.009 18:36:57 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:14.009 18:36:57 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:14.009 18:36:57 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:14.009 18:36:57 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:14.009 18:36:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # 
set +x 00:33:14.009 ************************************ 00:33:14.009 START TEST bdev_verify 00:33:14.009 ************************************ 00:33:14.009 18:36:57 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:14.009 [2024-07-12 18:36:57.562792] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:33:14.009 [2024-07-12 18:36:57.562856] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2655267 ] 00:33:14.009 [2024-07-12 18:36:57.692214] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:14.269 [2024-07-12 18:36:57.799167] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:14.269 [2024-07-12 18:36:57.799173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:14.269 [2024-07-12 18:36:57.981837] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:14.269 [2024-07-12 18:36:57.981905] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:14.269 [2024-07-12 18:36:57.981921] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:14.269 [2024-07-12 18:36:57.989860] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:14.269 [2024-07-12 18:36:57.989880] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:14.269 [2024-07-12 18:36:57.989892] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:14.528 [2024-07-12 18:36:57.997883] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:14.528 [2024-07-12 18:36:57.997903] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:14.528 [2024-07-12 18:36:57.997915] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:14.528 Running I/O for 5 seconds... 00:33:19.803 00:33:19.803 Latency(us) 00:33:19.803 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:19.803 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:19.803 Verification LBA range: start 0x0 length 0x800 00:33:19.803 crypto_ram : 5.02 6273.82 24.51 0.00 0.00 20321.24 1453.19 24162.84 00:33:19.803 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:19.803 Verification LBA range: start 0x800 length 0x800 00:33:19.803 crypto_ram : 5.02 6300.67 24.61 0.00 0.00 20236.79 1787.99 23820.91 00:33:19.803 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:19.803 Verification LBA range: start 0x0 length 0x800 00:33:19.803 crypto_ram3 : 5.02 3135.34 12.25 0.00 0.00 40595.07 7693.36 28379.94 00:33:19.803 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:19.803 Verification LBA range: start 0x800 length 0x800 00:33:19.803 crypto_ram3 : 5.02 3159.01 12.34 0.00 0.00 40290.71 1738.13 28265.96 00:33:19.803 =================================================================================================================== 00:33:19.803 Total : 18868.84 73.71 0.00 0.00 27009.13 1453.19 28379.94 00:33:19.803 00:33:19.803 real 0m5.803s 00:33:19.803 user 0m10.896s 00:33:19.803 sys 0m0.246s 00:33:19.803 18:37:03 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:19.803 18:37:03 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:19.803 
************************************ 00:33:19.803 END TEST bdev_verify 00:33:19.803 ************************************ 00:33:19.803 18:37:03 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:19.803 18:37:03 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:19.803 18:37:03 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:19.803 18:37:03 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:19.803 18:37:03 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:19.803 ************************************ 00:33:19.803 START TEST bdev_verify_big_io 00:33:19.803 ************************************ 00:33:19.804 18:37:03 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:19.804 [2024-07-12 18:37:03.430837] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:33:19.804 [2024-07-12 18:37:03.430898] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2656009 ] 00:33:20.062 [2024-07-12 18:37:03.559810] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:20.062 [2024-07-12 18:37:03.662122] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:20.062 [2024-07-12 18:37:03.662128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:20.321 [2024-07-12 18:37:03.828784] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:20.321 [2024-07-12 18:37:03.828852] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:20.321 [2024-07-12 18:37:03.828866] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:20.321 [2024-07-12 18:37:03.836804] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:20.321 [2024-07-12 18:37:03.836823] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:20.321 [2024-07-12 18:37:03.836834] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:20.321 [2024-07-12 18:37:03.844828] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:20.321 [2024-07-12 18:37:03.844847] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:20.321 [2024-07-12 18:37:03.844858] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:20.321 Running I/O for 5 seconds... 
00:33:25.596 00:33:25.596 Latency(us) 00:33:25.596 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:25.596 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:25.596 Verification LBA range: start 0x0 length 0x80 00:33:25.596 crypto_ram : 5.02 458.82 28.68 0.00 0.00 272219.10 6012.22 381134.58 00:33:25.596 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:25.596 Verification LBA range: start 0x80 length 0x80 00:33:25.596 crypto_ram : 5.25 462.93 28.93 0.00 0.00 270126.39 6097.70 379310.97 00:33:25.596 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:25.596 Verification LBA range: start 0x0 length 0x80 00:33:25.596 crypto_ram3 : 5.23 244.71 15.29 0.00 0.00 491351.08 5812.76 390252.63 00:33:25.596 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:25.596 Verification LBA range: start 0x80 length 0x80 00:33:25.596 crypto_ram3 : 5.27 243.03 15.19 0.00 0.00 495022.29 5670.29 386605.41 00:33:25.596 =================================================================================================================== 00:33:25.596 Total : 1409.49 88.09 0.00 0.00 349054.02 5670.29 390252.63 00:33:25.855 00:33:25.855 real 0m6.056s 00:33:25.855 user 0m11.414s 00:33:25.855 sys 0m0.233s 00:33:25.855 18:37:09 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:25.855 18:37:09 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:33:25.855 ************************************ 00:33:25.855 END TEST bdev_verify_big_io 00:33:25.855 ************************************ 00:33:25.855 18:37:09 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:25.855 18:37:09 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:25.855 18:37:09 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:25.855 18:37:09 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:25.855 18:37:09 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:25.855 ************************************ 00:33:25.855 START TEST bdev_write_zeroes 00:33:25.855 ************************************ 00:33:25.855 18:37:09 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:25.855 [2024-07-12 18:37:09.556186] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:33:25.856 [2024-07-12 18:37:09.556246] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2656863 ] 00:33:26.114 [2024-07-12 18:37:09.683618] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:26.114 [2024-07-12 18:37:09.785387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:26.373 [2024-07-12 18:37:09.957180] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:26.373 [2024-07-12 18:37:09.957253] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:26.373 [2024-07-12 18:37:09.957268] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:26.373 [2024-07-12 18:37:09.965199] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:26.373 [2024-07-12 18:37:09.965222] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:26.373 [2024-07-12 18:37:09.965234] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:26.373 [2024-07-12 18:37:09.973221] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:26.373 [2024-07-12 18:37:09.973242] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:26.373 [2024-07-12 18:37:09.973255] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:26.373 Running I/O for 1 seconds... 00:33:27.750 00:33:27.750 Latency(us) 00:33:27.750 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:27.750 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:27.750 crypto_ram : 1.01 26646.05 104.09 0.00 0.00 4791.02 1282.23 7351.43 00:33:27.750 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:27.750 crypto_ram3 : 1.01 13296.23 51.94 0.00 0.00 9551.85 5983.72 10713.71 00:33:27.750 =================================================================================================================== 00:33:27.750 Total : 39942.28 156.02 0.00 0.00 6377.96 1282.23 10713.71 00:33:27.750 00:33:27.750 real 0m1.753s 00:33:27.750 user 0m1.492s 00:33:27.750 sys 0m0.244s 00:33:27.750 18:37:11 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:27.750 18:37:11 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:27.750 ************************************ 00:33:27.750 END TEST bdev_write_zeroes 00:33:27.750 ************************************ 00:33:27.750 18:37:11 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:27.750 18:37:11 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:27.750 18:37:11 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:27.750 18:37:11 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:27.750 18:37:11 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:27.750 ************************************ 00:33:27.750 START TEST bdev_json_nonenclosed 00:33:27.750 ************************************ 00:33:27.750 18:37:11 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:27.750 [2024-07-12 18:37:11.376580] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:33:27.750 [2024-07-12 18:37:11.376645] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2657065 ] 00:33:28.009 [2024-07-12 18:37:11.504705] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:28.009 [2024-07-12 18:37:11.604906] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:28.009 [2024-07-12 18:37:11.604987] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:33:28.009 [2024-07-12 18:37:11.605008] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:28.009 [2024-07-12 18:37:11.605021] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:28.009 00:33:28.009 real 0m0.387s 00:33:28.009 user 0m0.245s 00:33:28.009 sys 0m0.138s 00:33:28.009 18:37:11 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:28.009 18:37:11 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:28.009 18:37:11 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:28.009 ************************************ 00:33:28.009 END TEST bdev_json_nonenclosed 00:33:28.009 ************************************ 00:33:28.324 18:37:11 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:28.324 18:37:11 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:33:28.324 18:37:11 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:28.324 18:37:11 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:28.324 18:37:11 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:28.324 18:37:11 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:28.324 ************************************ 00:33:28.324 START TEST bdev_json_nonarray 00:33:28.324 ************************************ 00:33:28.324 18:37:11 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:28.324 [2024-07-12 
18:37:11.862624] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:33:28.324 [2024-07-12 18:37:11.862691] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2657169 ] 00:33:28.324 [2024-07-12 18:37:11.991319] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:28.583 [2024-07-12 18:37:12.092485] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:28.583 [2024-07-12 18:37:12.092564] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:33:28.583 [2024-07-12 18:37:12.092585] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:28.583 [2024-07-12 18:37:12.092599] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:28.583 00:33:28.583 real 0m0.401s 00:33:28.583 user 0m0.237s 00:33:28.583 sys 0m0.161s 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:28.583 ************************************ 00:33:28.583 END TEST bdev_json_nonarray 00:33:28.583 ************************************ 00:33:28.583 18:37:12 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:28.583 18:37:12 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:33:28.583 18:37:12 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:33:28.583 18:37:12 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:33:28.583 18:37:12 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw 
]] 00:33:28.583 18:37:12 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:33:28.583 18:37:12 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:33:28.583 18:37:12 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:28.583 18:37:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:28.583 ************************************ 00:33:28.583 START TEST bdev_crypto_enomem 00:33:28.583 ************************************ 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=2657273 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 2657273 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 2657273 ']' 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:28.583 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:28.583 18:37:12 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:28.843 [2024-07-12 18:37:12.356608] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:33:28.843 [2024-07-12 18:37:12.356680] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2657273 ] 00:33:28.843 [2024-07-12 18:37:12.478511] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:29.102 [2024-07-12 18:37:12.584924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:29.669 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:29.669 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:33:29.669 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:33:29.669 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:29.669 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:29.669 true 00:33:29.669 base0 00:33:29.669 true 00:33:29.669 [2024-07-12 18:37:13.258331] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:29.669 crypt0 00:33:29.669 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:29.669 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:29.670 [ 00:33:29.670 { 00:33:29.670 "name": "crypt0", 00:33:29.670 "aliases": [ 00:33:29.670 "1dc41a46-c598-506b-bf03-f9e373499c70" 00:33:29.670 ], 00:33:29.670 "product_name": "crypto", 00:33:29.670 "block_size": 512, 00:33:29.670 "num_blocks": 2097152, 00:33:29.670 "uuid": "1dc41a46-c598-506b-bf03-f9e373499c70", 00:33:29.670 "assigned_rate_limits": { 00:33:29.670 "rw_ios_per_sec": 0, 00:33:29.670 "rw_mbytes_per_sec": 0, 00:33:29.670 
"r_mbytes_per_sec": 0, 00:33:29.670 "w_mbytes_per_sec": 0 00:33:29.670 }, 00:33:29.670 "claimed": false, 00:33:29.670 "zoned": false, 00:33:29.670 "supported_io_types": { 00:33:29.670 "read": true, 00:33:29.670 "write": true, 00:33:29.670 "unmap": false, 00:33:29.670 "flush": false, 00:33:29.670 "reset": true, 00:33:29.670 "nvme_admin": false, 00:33:29.670 "nvme_io": false, 00:33:29.670 "nvme_io_md": false, 00:33:29.670 "write_zeroes": true, 00:33:29.670 "zcopy": false, 00:33:29.670 "get_zone_info": false, 00:33:29.670 "zone_management": false, 00:33:29.670 "zone_append": false, 00:33:29.670 "compare": false, 00:33:29.670 "compare_and_write": false, 00:33:29.670 "abort": false, 00:33:29.670 "seek_hole": false, 00:33:29.670 "seek_data": false, 00:33:29.670 "copy": false, 00:33:29.670 "nvme_iov_md": false 00:33:29.670 }, 00:33:29.670 "memory_domains": [ 00:33:29.670 { 00:33:29.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:29.670 "dma_device_type": 2 00:33:29.670 } 00:33:29.670 ], 00:33:29.670 "driver_specific": { 00:33:29.670 "crypto": { 00:33:29.670 "base_bdev_name": "EE_base0", 00:33:29.670 "name": "crypt0", 00:33:29.670 "key_name": "test_dek_sw" 00:33:29.670 } 00:33:29.670 } 00:33:29.670 } 00:33:29.670 ] 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=2657418 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:33:29.670 18:37:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:29.928 Running I/O for 5 seconds... 
00:33:30.865 18:37:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:33:30.865 18:37:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:30.865 18:37:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:30.865 18:37:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:30.865 18:37:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 2657418 00:33:35.056 00:33:35.056 Latency(us) 00:33:35.056 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:35.056 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:33:35.056 crypt0 : 5.00 36343.07 141.97 0.00 0.00 876.89 409.60 1161.13 00:33:35.056 =================================================================================================================== 00:33:35.056 Total : 36343.07 141.97 0.00 0.00 876.89 409.60 1161.13 00:33:35.056 0 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 2657273 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 2657273 ']' 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 2657273 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:33:35.056 18:37:18 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2657273 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2657273' 00:33:35.056 killing process with pid 2657273 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 2657273 00:33:35.056 Received shutdown signal, test time was about 5.000000 seconds 00:33:35.056 00:33:35.056 Latency(us) 00:33:35.056 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:35.056 =================================================================================================================== 00:33:35.056 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 2657273 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:33:35.056 00:33:35.056 real 0m6.419s 00:33:35.056 user 0m6.641s 00:33:35.056 sys 0m0.365s 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:35.056 18:37:18 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:35.056 ************************************ 00:33:35.056 END TEST bdev_crypto_enomem 00:33:35.056 ************************************ 00:33:35.056 18:37:18 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:35.056 18:37:18 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - 
SIGINT SIGTERM EXIT 00:33:35.056 18:37:18 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:33:35.056 18:37:18 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:35.056 18:37:18 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:35.056 18:37:18 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:33:35.056 18:37:18 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:33:35.056 18:37:18 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:33:35.056 18:37:18 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:33:35.056 00:33:35.056 real 0m58.817s 00:33:35.056 user 1m34.637s 00:33:35.056 sys 0m7.121s 00:33:35.056 18:37:18 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:35.056 18:37:18 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:35.056 ************************************ 00:33:35.056 END TEST blockdev_crypto_sw 00:33:35.056 ************************************ 00:33:35.316 18:37:18 -- common/autotest_common.sh@1142 -- # return 0 00:33:35.316 18:37:18 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:33:35.316 18:37:18 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:35.316 18:37:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:35.316 18:37:18 -- common/autotest_common.sh@10 -- # set +x 00:33:35.316 ************************************ 00:33:35.316 START TEST blockdev_crypto_qat 00:33:35.316 ************************************ 00:33:35.316 18:37:18 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:33:35.316 * Looking for test storage... 
00:33:35.316 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # 
env_ctx= 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2658207 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 2658207 00:33:35.316 18:37:18 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:35.316 18:37:18 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 2658207 ']' 00:33:35.316 18:37:18 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:35.316 18:37:18 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:35.316 18:37:18 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:35.316 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:35.316 18:37:18 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:35.316 18:37:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:35.316 [2024-07-12 18:37:19.041750] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:33:35.316 [2024-07-12 18:37:19.041825] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2658207 ] 00:33:35.575 [2024-07-12 18:37:19.171827] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:35.575 [2024-07-12 18:37:19.276239] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:36.511 18:37:19 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:36.511 18:37:19 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:33:36.511 18:37:19 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:33:36.511 18:37:19 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:33:36.511 18:37:19 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:33:36.511 18:37:19 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:36.511 18:37:19 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:36.511 [2024-07-12 18:37:19.982469] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:36.511 [2024-07-12 18:37:19.990503] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:36.511 [2024-07-12 18:37:19.998519] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:36.511 [2024-07-12 18:37:20.067749] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:39.043 true 00:33:39.043 true 00:33:39.043 true 00:33:39.043 true 00:33:39.043 Malloc0 00:33:39.043 Malloc1 00:33:39.043 Malloc2 00:33:39.043 Malloc3 00:33:39.043 [2024-07-12 18:37:22.434086] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:33:39.043 crypto_ram
00:33:39.043 [2024-07-12 18:37:22.442103] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:33:39.043 crypto_ram1
00:33:39.043 [2024-07-12 18:37:22.450123] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:33:39.043 crypto_ram2
00:33:39.043 [2024-07-12 18:37:22.458145] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:33:39.043 crypto_ram3
00:33:39.043 [
00:33:39.043 {
00:33:39.043 "name": "Malloc1",
00:33:39.043 "aliases": [
00:33:39.043 "237bf543-7c15-42cb-9122-7ddc2593e47a"
00:33:39.043 ],
00:33:39.043 "product_name": "Malloc disk",
00:33:39.043 "block_size": 512,
00:33:39.043 "num_blocks": 65536,
00:33:39.043 "uuid": "237bf543-7c15-42cb-9122-7ddc2593e47a",
00:33:39.043 "assigned_rate_limits": {
00:33:39.043 "rw_ios_per_sec": 0,
00:33:39.043 "rw_mbytes_per_sec": 0,
00:33:39.043 "r_mbytes_per_sec": 0,
00:33:39.043 "w_mbytes_per_sec": 0
00:33:39.043 },
00:33:39.043 "claimed": true,
00:33:39.043 "claim_type": "exclusive_write",
00:33:39.043 "zoned": false,
00:33:39.043 "supported_io_types": {
00:33:39.043 "read": true,
00:33:39.043 "write": true,
00:33:39.043 "unmap": true,
00:33:39.043 "flush": true,
00:33:39.043 "reset": true,
00:33:39.043 "nvme_admin": false,
00:33:39.043 "nvme_io": false,
00:33:39.043 "nvme_io_md": false,
00:33:39.043 "write_zeroes": true,
00:33:39.043 "zcopy": true,
00:33:39.043 "get_zone_info": false,
00:33:39.043 "zone_management": false,
00:33:39.043 "zone_append": false,
00:33:39.043 "compare": false,
00:33:39.043 "compare_and_write": false,
00:33:39.043 "abort": true,
00:33:39.043 "seek_hole": false,
00:33:39.043 "seek_data": false,
00:33:39.043 "copy": true,
00:33:39.043 "nvme_iov_md": false
00:33:39.043 },
00:33:39.043 "memory_domains": [
00:33:39.043 {
00:33:39.043 "dma_device_id": "system",
00:33:39.043 "dma_device_type": 1
00:33:39.043 },
00:33:39.043 {
00:33:39.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:33:39.043 "dma_device_type": 2
00:33:39.043 }
00:33:39.043 ],
00:33:39.043 "driver_specific": {}
00:33:39.043 }
00:33:39.043 ]
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat
00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs
00:33:39.043 18:37:22 blockdev_crypto_qat --
bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "8d5c809f-46e7-51cd-8329-866d01d0c902"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8d5c809f-46e7-51cd-8329-866d01d0c902",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' 
"aliases": [' ' "d29b4cbb-9586-505c-8045-48e6b07fc7e6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d29b4cbb-9586-505c-8045-48e6b07fc7e6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "36b31988-15f8-5347-bba8-441219305332"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "36b31988-15f8-5347-bba8-441219305332",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "67a9dd8b-5ab4-5b8b-be1e-2d5148bb0a89"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "67a9dd8b-5ab4-5b8b-be1e-2d5148bb0a89",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:33:39.043 
18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:33:39.043 18:37:22 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 2658207 00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 2658207 ']' 00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 2658207 00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:39.043 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2658207 00:33:39.302 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:39.302 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:39.302 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2658207' 00:33:39.302 killing process with pid 2658207 00:33:39.302 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 2658207 00:33:39.302 18:37:22 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 2658207 00:33:39.869 18:37:23 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:39.870 18:37:23 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:39.870 18:37:23 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:39.870 18:37:23 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:39.870 18:37:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:39.870 ************************************ 00:33:39.870 START TEST bdev_hello_world 00:33:39.870 
************************************ 00:33:39.870 18:37:23 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:39.870 [2024-07-12 18:37:23.469168] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:33:39.870 [2024-07-12 18:37:23.469230] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2658752 ] 00:33:39.870 [2024-07-12 18:37:23.595296] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:40.129 [2024-07-12 18:37:23.695912] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:40.129 [2024-07-12 18:37:23.717231] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:40.129 [2024-07-12 18:37:23.725259] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:40.129 [2024-07-12 18:37:23.733288] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:40.129 [2024-07-12 18:37:23.840677] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:42.665 [2024-07-12 18:37:26.058805] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:42.665 [2024-07-12 18:37:26.058873] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:42.665 [2024-07-12 18:37:26.058889] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:42.665 [2024-07-12 18:37:26.066825] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_qat_xts" 00:33:42.665 [2024-07-12 18:37:26.066848] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:42.665 [2024-07-12 18:37:26.066860] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:42.665 [2024-07-12 18:37:26.074846] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:42.665 [2024-07-12 18:37:26.074865] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:42.665 [2024-07-12 18:37:26.074877] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:42.665 [2024-07-12 18:37:26.082867] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:42.665 [2024-07-12 18:37:26.082889] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:42.665 [2024-07-12 18:37:26.082901] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:42.665 [2024-07-12 18:37:26.159577] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:42.665 [2024-07-12 18:37:26.159622] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:42.665 [2024-07-12 18:37:26.159640] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:33:42.665 [2024-07-12 18:37:26.160910] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:42.665 [2024-07-12 18:37:26.160990] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:42.665 [2024-07-12 18:37:26.161009] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:42.665 [2024-07-12 18:37:26.161053] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:33:42.665 00:33:42.665 [2024-07-12 18:37:26.161073] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:42.924 00:33:42.924 real 0m3.109s 00:33:42.924 user 0m2.704s 00:33:42.924 sys 0m0.369s 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:42.924 ************************************ 00:33:42.924 END TEST bdev_hello_world 00:33:42.924 ************************************ 00:33:42.924 18:37:26 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:42.924 18:37:26 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:42.924 18:37:26 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:42.924 18:37:26 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:42.924 18:37:26 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:42.924 ************************************ 00:33:42.924 START TEST bdev_bounds 00:33:42.924 ************************************ 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2659132 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2659132' 00:33:42.924 Process bdevio pid: 2659132 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@293 -- # waitforlisten 2659132 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2659132 ']' 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:42.924 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:42.924 18:37:26 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:43.182 [2024-07-12 18:37:26.656113] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:33:43.182 [2024-07-12 18:37:26.656175] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2659132 ] 00:33:43.182 [2024-07-12 18:37:26.782704] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:43.182 [2024-07-12 18:37:26.889687] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:43.182 [2024-07-12 18:37:26.889782] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:43.182 [2024-07-12 18:37:26.889786] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:43.440 [2024-07-12 18:37:26.911109] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:43.440 [2024-07-12 18:37:26.919133] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:43.440 [2024-07-12 18:37:26.927155] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:43.440 [2024-07-12 18:37:27.027504] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:45.975 [2024-07-12 18:37:29.225438] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:45.975 [2024-07-12 18:37:29.225524] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:45.975 [2024-07-12 18:37:29.225539] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.975 [2024-07-12 18:37:29.233456] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:45.975 [2024-07-12 18:37:29.233476] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:45.975 [2024-07-12 18:37:29.233488] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.975 [2024-07-12 18:37:29.241477] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:45.975 [2024-07-12 18:37:29.241496] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:45.975 [2024-07-12 18:37:29.241508] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.975 [2024-07-12 18:37:29.249499] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:45.975 [2024-07-12 18:37:29.249517] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:45.975 [2024-07-12 18:37:29.249529] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.975 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:45.975 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:45.975 18:37:29 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:45.975 I/O targets: 00:33:45.975 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:33:45.975 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:33:45.975 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:33:45.975 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:33:45.975 00:33:45.975 00:33:45.975 CUnit - A unit testing framework for C - Version 2.1-3 00:33:45.975 http://cunit.sourceforge.net/ 00:33:45.975 00:33:45.975 00:33:45.975 Suite: bdevio tests on: crypto_ram3 00:33:45.975 Test: blockdev write read block ...passed 00:33:45.975 Test: blockdev write zeroes read block ...passed 00:33:45.975 Test: blockdev write zeroes read no split ...passed 00:33:45.975 Test: blockdev write zeroes read split 
...passed
00:33:45.975 Test: blockdev write zeroes read split partial ...passed
00:33:45.975 Test: blockdev reset ...passed
00:33:45.975 Test: blockdev write read 8 blocks ...passed
00:33:45.975 Test: blockdev write read size > 128k ...passed
00:33:45.975 Test: blockdev write read invalid size ...passed
00:33:45.975 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:33:45.975 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:33:45.975 Test: blockdev write read max offset ...passed
00:33:45.975 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:33:45.975 Test: blockdev writev readv 8 blocks ...passed
00:33:45.975 Test: blockdev writev readv 30 x 1block ...passed
00:33:45.975 Test: blockdev writev readv block ...passed
00:33:45.975 Test: blockdev writev readv size > 128k ...passed
00:33:45.975 Test: blockdev writev readv size > 128k in two iovs ...passed
00:33:45.975 Test: blockdev comparev and writev ...passed
00:33:45.975 Test: blockdev nvme passthru rw ...passed
00:33:45.975 Test: blockdev nvme passthru vendor specific ...passed
00:33:45.975 Test: blockdev nvme admin passthru ...passed
00:33:45.975 Test: blockdev copy ...passed
00:33:45.975 Suite: bdevio tests on: crypto_ram2
00:33:45.975 Test: blockdev write read block ...passed
00:33:45.975 Test: blockdev write zeroes read block ...passed
00:33:45.975 Test: blockdev write zeroes read no split ...passed
00:33:45.975 Test: blockdev write zeroes read split ...passed
00:33:45.975 Test: blockdev write zeroes read split partial ...passed
00:33:45.975 Test: blockdev reset ...passed
00:33:45.975 Test: blockdev write read 8 blocks ...passed
00:33:45.975 Test: blockdev write read size > 128k ...passed
00:33:45.975 Test: blockdev write read invalid size ...passed
00:33:45.975 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:33:45.975 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:33:45.975 Test: blockdev write read max offset ...passed
00:33:45.975 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:33:45.975 Test: blockdev writev readv 8 blocks ...passed
00:33:45.975 Test: blockdev writev readv 30 x 1block ...passed
00:33:45.975 Test: blockdev writev readv block ...passed
00:33:45.975 Test: blockdev writev readv size > 128k ...passed
00:33:45.975 Test: blockdev writev readv size > 128k in two iovs ...passed
00:33:45.975 Test: blockdev comparev and writev ...passed
00:33:45.975 Test: blockdev nvme passthru rw ...passed
00:33:45.975 Test: blockdev nvme passthru vendor specific ...passed
00:33:45.975 Test: blockdev nvme admin passthru ...passed
00:33:45.975 Test: blockdev copy ...passed
00:33:45.975 Suite: bdevio tests on: crypto_ram1
00:33:45.976 Test: blockdev write read block ...passed
00:33:45.976 Test: blockdev write zeroes read block ...passed
00:33:45.976 Test: blockdev write zeroes read no split ...passed
00:33:45.976 Test: blockdev write zeroes read split ...passed
00:33:45.976 Test: blockdev write zeroes read split partial ...passed
00:33:45.976 Test: blockdev reset ...passed
00:33:45.976 Test: blockdev write read 8 blocks ...passed
00:33:45.976 Test: blockdev write read size > 128k ...passed
00:33:45.976 Test: blockdev write read invalid size ...passed
00:33:45.976 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:33:45.976 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:33:45.976 Test: blockdev write read max offset ...passed
00:33:45.976 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:33:45.976 Test: blockdev writev readv 8 blocks ...passed
00:33:45.976 Test: blockdev writev readv 30 x 1block ...passed
00:33:45.976 Test: blockdev writev readv block ...passed
00:33:45.976 Test: blockdev writev readv size > 128k ...passed
00:33:45.976 Test: blockdev writev readv size > 128k in two iovs ...passed
00:33:45.976 Test: blockdev comparev and writev ...passed
00:33:45.976 Test: blockdev nvme passthru rw ...passed
00:33:45.976 Test: blockdev nvme passthru vendor specific ...passed
00:33:45.976 Test: blockdev nvme admin passthru ...passed
00:33:45.976 Test: blockdev copy ...passed
00:33:45.976 Suite: bdevio tests on: crypto_ram
00:33:45.976 Test: blockdev write read block ...passed
00:33:45.976 Test: blockdev write zeroes read block ...passed
00:33:45.976 Test: blockdev write zeroes read no split ...passed
00:33:45.976 Test: blockdev write zeroes read split ...passed
00:33:45.976 Test: blockdev write zeroes read split partial ...passed
00:33:45.976 Test: blockdev reset ...passed
00:33:45.976 Test: blockdev write read 8 blocks ...passed
00:33:45.976 Test: blockdev write read size > 128k ...passed
00:33:45.976 Test: blockdev write read invalid size ...passed
00:33:45.976 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:33:45.976 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:33:45.976 Test: blockdev write read max offset ...passed
00:33:45.976 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:33:45.976 Test: blockdev writev readv 8 blocks ...passed
00:33:45.976 Test: blockdev writev readv 30 x 1block ...passed
00:33:45.976 Test: blockdev writev readv block ...passed
00:33:45.976 Test: blockdev writev readv size > 128k ...passed
00:33:45.976 Test: blockdev writev readv size > 128k in two iovs ...passed
00:33:45.976 Test: blockdev comparev and writev ...passed
00:33:45.976 Test: blockdev nvme passthru rw ...passed
00:33:45.976 Test: blockdev nvme passthru vendor specific ...passed
00:33:45.976 Test: blockdev nvme admin passthru ...passed
00:33:45.976 Test: blockdev copy ...passed
00:33:45.976
00:33:45.976 Run Summary: Type Total Ran Passed Failed Inactive
00:33:45.976 suites 4 4 n/a 0 0
00:33:45.976 tests 92 92 92 0 0
00:33:45.976 asserts 520 520 520 0 n/a
00:33:45.976
00:33:45.976 Elapsed time = 0.518 seconds
00:33:45.976 0
00:33:46.235
18:37:29 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2659132 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2659132 ']' 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2659132 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2659132 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2659132' 00:33:46.235 killing process with pid 2659132 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2659132 00:33:46.235 18:37:29 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2659132 00:33:46.494 18:37:30 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:33:46.494 00:33:46.494 real 0m3.556s 00:33:46.494 user 0m9.897s 00:33:46.494 sys 0m0.523s 00:33:46.494 18:37:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:46.494 18:37:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:46.494 ************************************ 00:33:46.494 END TEST bdev_bounds 00:33:46.494 ************************************ 00:33:46.494 18:37:30 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:46.494 18:37:30 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:33:46.494 18:37:30 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:46.494 18:37:30 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:46.494 18:37:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:46.754 ************************************ 00:33:46.754 START TEST bdev_nbd 00:33:46.754 ************************************ 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:46.754 18:37:30 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2659673 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2659673 /var/tmp/spdk-nbd.sock 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2659673 ']' 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:46.754 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:46.754 18:37:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:46.754 [2024-07-12 18:37:30.302863] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:33:46.754 [2024-07-12 18:37:30.302935] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:46.754 [2024-07-12 18:37:30.432950] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:47.013 [2024-07-12 18:37:30.539566] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:47.013 [2024-07-12 18:37:30.560915] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:47.013 [2024-07-12 18:37:30.568943] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:47.013 [2024-07-12 18:37:30.576960] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:47.013 [2024-07-12 18:37:30.683633] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:49.551 [2024-07-12 18:37:32.903387] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:49.551 [2024-07-12 18:37:32.903452] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:49.551 [2024-07-12 18:37:32.903471] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:49.551 [2024-07-12 18:37:32.911407] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:49.551 [2024-07-12 18:37:32.911428] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc1 00:33:49.551 [2024-07-12 18:37:32.911440] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:49.551 [2024-07-12 18:37:32.919425] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:49.551 [2024-07-12 18:37:32.919445] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:49.551 [2024-07-12 18:37:32.919457] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:49.551 [2024-07-12 18:37:32.927446] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:49.551 [2024-07-12 18:37:32.927465] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:49.551 [2024-07-12 18:37:32.927476] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:49.551 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:49.810 
18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:49.810 1+0 records in 00:33:49.810 1+0 records out 00:33:49.810 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312303 s, 13.1 MB/s 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:49.810 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:50.068 1+0 records in 00:33:50.068 1+0 records out 00:33:50.068 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300256 s, 13.6 MB/s 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:50.068 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:50.328 1+0 records in 00:33:50.328 1+0 records out 00:33:50.328 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000202995 s, 20.2 MB/s 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:50.328 18:37:33 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:50.618 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:33:50.618 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:33:50.618 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:33:50.618 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:33:50.618 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:50.618 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:50.618 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:50.618 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:33:50.618 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:50.619 1+0 records in 00:33:50.619 1+0 records out 00:33:50.619 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309163 s, 13.2 MB/s 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:33:50.619 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:50.877 { 00:33:50.877 "nbd_device": "/dev/nbd0", 00:33:50.877 "bdev_name": "crypto_ram" 00:33:50.877 }, 00:33:50.877 { 00:33:50.877 "nbd_device": "/dev/nbd1", 00:33:50.877 "bdev_name": "crypto_ram1" 00:33:50.877 }, 00:33:50.877 { 00:33:50.877 "nbd_device": "/dev/nbd2", 00:33:50.877 "bdev_name": "crypto_ram2" 00:33:50.877 }, 00:33:50.877 { 00:33:50.877 "nbd_device": "/dev/nbd3", 00:33:50.877 "bdev_name": "crypto_ram3" 00:33:50.877 } 00:33:50.877 ]' 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:50.877 { 00:33:50.877 "nbd_device": "/dev/nbd0", 00:33:50.877 "bdev_name": "crypto_ram" 00:33:50.877 }, 00:33:50.877 { 00:33:50.877 "nbd_device": "/dev/nbd1", 00:33:50.877 "bdev_name": "crypto_ram1" 00:33:50.877 }, 00:33:50.877 { 00:33:50.877 "nbd_device": "/dev/nbd2", 00:33:50.877 "bdev_name": "crypto_ram2" 00:33:50.877 }, 00:33:50.877 { 00:33:50.877 "nbd_device": "/dev/nbd3", 00:33:50.877 "bdev_name": 
"crypto_ram3" 00:33:50.877 } 00:33:50.877 ]' 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:50.877 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:51.136 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:51.136 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:51.136 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:51.136 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:51.136 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:51.136 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:51.136 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:51.136 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:51.136 18:37:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:51.136 18:37:34 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:51.395 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:51.654 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:51.913 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:51.914 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:52.173 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:52.173 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:52.173 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:33:52.432 18:37:35 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:52.432 18:37:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:52.432 /dev/nbd0 00:33:52.432 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:52.691 1+0 records in 00:33:52.691 1+0 records out 00:33:52.691 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328889 s, 12.5 MB/s 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:52.691 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:33:52.951 /dev/nbd1 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:52.951 1+0 records in 00:33:52.951 1+0 records out 00:33:52.951 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266675 s, 15.4 MB/s 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:52.951 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:33:53.210 /dev/nbd10 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:53.210 1+0 records in 00:33:53.210 1+0 records out 00:33:53.210 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236552 s, 17.3 MB/s 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:53.210 18:37:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:33:53.470 /dev/nbd11 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:53.470 1+0 records in 00:33:53.470 1+0 records out 00:33:53.470 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317458 s, 12.9 MB/s 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:53.470 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:53.730 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:53.730 { 00:33:53.730 "nbd_device": "/dev/nbd0", 00:33:53.730 "bdev_name": "crypto_ram" 00:33:53.730 }, 00:33:53.730 { 00:33:53.730 "nbd_device": "/dev/nbd1", 00:33:53.730 "bdev_name": "crypto_ram1" 00:33:53.730 }, 00:33:53.730 { 00:33:53.730 "nbd_device": "/dev/nbd10", 00:33:53.730 "bdev_name": "crypto_ram2" 00:33:53.730 }, 00:33:53.730 { 00:33:53.730 "nbd_device": "/dev/nbd11", 00:33:53.730 "bdev_name": "crypto_ram3" 00:33:53.730 } 00:33:53.730 ]' 00:33:53.730 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:53.730 { 00:33:53.730 "nbd_device": "/dev/nbd0", 00:33:53.730 "bdev_name": "crypto_ram" 00:33:53.730 }, 00:33:53.730 { 00:33:53.730 "nbd_device": "/dev/nbd1", 00:33:53.730 "bdev_name": "crypto_ram1" 00:33:53.730 }, 00:33:53.730 { 00:33:53.730 "nbd_device": "/dev/nbd10", 00:33:53.730 "bdev_name": "crypto_ram2" 00:33:53.730 }, 00:33:53.730 { 00:33:53.730 "nbd_device": "/dev/nbd11", 00:33:53.730 "bdev_name": "crypto_ram3" 00:33:53.730 } 00:33:53.730 ]' 00:33:53.730 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:53.730 18:37:37 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:53.730 /dev/nbd1 00:33:53.730 /dev/nbd10 00:33:53.730 /dev/nbd11' 00:33:53.730 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:53.730 /dev/nbd1 00:33:53.730 /dev/nbd10 00:33:53.730 /dev/nbd11' 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:53.731 256+0 records in 00:33:53.731 256+0 records out 00:33:53.731 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114985 s, 91.2 MB/s 00:33:53.731 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:53.731 
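The device-count check traced here (nbd_common.sh@64-66) pipes the jq-extracted device names through `grep -c /dev/nbd` and compares the result to the expected number of mapped devices. A minimal stand-alone sketch of that counting step, with the device list hard-coded in place of the real `rpc.py nbd_get_disks | jq -r '.[] | .nbd_device'` output:

```shell
# Stand-in for the jq-extracted nbd_disks_name value seen in the log.
nbd_disks_name='/dev/nbd0
/dev/nbd1
/dev/nbd10
/dev/nbd11'

# grep -c prints the number of matching lines, but it exits non-zero when
# that count is 0 -- which is why the trace shows an extra `-- # true`
# after the empty-list queries, keeping a `set -e` script alive.
count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)
echo "count=$count"
```

With all four devices mapped this yields `count=4`; after `nbd_stop_disk` has torn everything down the same pipeline yields `count=0` via the `true` fallback.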
18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:53.990 256+0 records in 00:33:53.990 256+0 records out 00:33:53.990 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0841427 s, 12.5 MB/s 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:53.990 256+0 records in 00:33:53.990 256+0 records out 00:33:53.990 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0662077 s, 15.8 MB/s 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:33:53.990 256+0 records in 00:33:53.990 256+0 records out 00:33:53.990 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0583023 s, 18.0 MB/s 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:33:53.990 256+0 records in 00:33:53.990 256+0 records out 00:33:53.990 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0496789 s, 21.1 MB/s 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:53.990 18:37:37 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:33:53.990 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
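The write/verify cycle logged above (nbd_common.sh@76-85) generates one random 1 MiB buffer, dd's it onto every NBD device, then byte-compares each device against the source with `cmp`. A sketch of that pattern, using a temporary file as a stand-in for `/dev/nbdX` (no NBD device is assumed, and `oflag=direct` is dropped because O_DIRECT is unreliable on ordinary files):

```shell
# Write-then-verify pattern from nbd_dd_data_verify, against a plain file.
tmp_src=$(mktemp)
tmp_dev=$(mktemp)   # hypothetical stand-in for /dev/nbd0

# 1 MiB of random data, matching the log's bs=4096 count=256.
dd if=/dev/urandom of="$tmp_src" bs=4096 count=256 2>/dev/null

# "write" phase: copy the random data onto the device.
dd if="$tmp_src" of="$tmp_dev" bs=4096 count=256 2>/dev/null

# "verify" phase: compare the first 1M byte-for-byte; cmp exits non-zero
# on the first mismatch, which would abort the test under set -e.
cmp -b -n 1M "$tmp_src" "$tmp_dev" && verify=ok || verify=fail
rm -f "$tmp_src" "$tmp_dev"
echo "$verify"
```

Because the same `nbdrandtest` file is written to and compared against all four devices, a single pass catches both data corruption and cross-device mixups.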
00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:54.250 18:37:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:54.510 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:54.770 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:55.029 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:55.289 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:55.289 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:55.289 18:37:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:55.289 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:55.549 malloc_lvol_verify 00:33:55.549 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:55.808 51374fda-da8f-4643-a293-a4dbc0a3c856 00:33:55.808 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:56.067 c89f09d3-49c8-4bac-99c3-f62aa55ed742 00:33:56.067 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:56.326 /dev/nbd0 
00:33:56.326 18:37:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:56.326 mke2fs 1.46.5 (30-Dec-2021) 00:33:56.326 Discarding device blocks: 0/4096 done 00:33:56.326 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:56.326 00:33:56.326 Allocating group tables: 0/1 done 00:33:56.326 Writing inode tables: 0/1 done 00:33:56.326 Creating journal (1024 blocks): done 00:33:56.326 Writing superblocks and filesystem accounting information: 0/1 done 00:33:56.326 00:33:56.326 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:56.326 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:33:56.326 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:56.326 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:56.326 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:56.326 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:56.326 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:56.326 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2659673 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2659673 ']' 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2659673 00:33:56.585 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:56.843 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:56.843 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2659673 00:33:56.843 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:56.843 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:56.843 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2659673' 00:33:56.843 killing process with pid 2659673 00:33:56.843 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2659673 00:33:56.843 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2659673 00:33:57.410 18:37:40 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:57.410 00:33:57.410 real 0m10.701s 00:33:57.410 user 0m13.879s 00:33:57.410 sys 0m4.148s 00:33:57.410 18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:57.410 
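The `killprocess 2659673` teardown traced above first probes the pid with `kill -0`, inspects the command name with `ps`, and only then signals and reaps the process. A sketch of that sequence against a throwaway `sleep` child rather than the real SPDK reactor process:

```shell
# killprocess-style teardown (mirrors common/autotest_common.sh in the log),
# exercised here on a disposable background sleep.
sleep 30 &
pid=$!

# kill -0 sends no signal; it only tests whether the pid exists and is ours.
if kill -0 "$pid" 2>/dev/null; then
    # On Linux the helper also reads the command name before killing, so a
    # stale or reused pid is never signalled by mistake (and sudo is skipped).
    process_name=$(ps --no-headers -o comm= "$pid")
    kill "$pid"
    # wait reaps the child; it returns the (non-zero) signal exit status,
    # so the fallback keeps set -e scripts from aborting here.
    wait "$pid" 2>/dev/null || true
fi
echo "killed process with pid $pid ($process_name)"
```

The log's `wait 2659673` after the `kill` serves the same purpose: it blocks until the reactor has actually exited before the next test stage starts.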
18:37:40 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:57.410 ************************************ 00:33:57.410 END TEST bdev_nbd 00:33:57.410 ************************************ 00:33:57.410 18:37:40 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:57.410 18:37:40 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:57.410 18:37:40 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:33:57.410 18:37:40 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:33:57.410 18:37:40 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:57.410 18:37:40 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:57.410 18:37:40 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:57.410 18:37:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:57.410 ************************************ 00:33:57.410 START TEST bdev_fio 00:33:57.410 ************************************ 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:57.410 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 
00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 
00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:33:57.410 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:57.411 ************************************ 00:33:57.411 START TEST bdev_fio_rw_verify 00:33:57.411 ************************************ 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:57.411 18:37:41 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:57.411 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:57.673 18:37:41 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:57.930 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:57.930 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:57.930 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:57.930 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:57.930 fio-3.35 00:33:57.930 Starting 4 threads 00:34:12.787 00:34:12.787 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2661716: Fri Jul 12 18:37:54 2024 00:34:12.787 read: IOPS=21.3k, BW=83.3MiB/s (87.3MB/s)(833MiB/10001msec) 00:34:12.787 slat (usec): min=11, max=617, avg=65.94, stdev=38.00 00:34:12.787 clat (usec): min=16, max=2266, avg=351.14, stdev=231.78 00:34:12.787 lat (usec): min=57, max=2426, avg=417.08, stdev=253.16 00:34:12.787 clat percentiles (usec): 00:34:12.787 | 50.000th=[ 281], 99.000th=[ 1074], 99.900th=[ 1254], 99.990th=[ 1565], 00:34:12.787 | 99.999th=[ 1991] 00:34:12.787 write: IOPS=23.3k, BW=91.2MiB/s (95.6MB/s)(891MiB/9774msec); 0 zone resets 00:34:12.787 slat (usec): min=13, max=14514, avg=76.71, stdev=48.84 00:34:12.787 clat (usec): min=19, 
max=14915, avg=389.52, stdev=248.37 00:34:12.787 lat (usec): min=61, max=14956, avg=466.23, stdev=271.34 00:34:12.787 clat percentiles (usec): 00:34:12.787 | 50.000th=[ 326], 99.000th=[ 1156], 99.900th=[ 1352], 99.990th=[ 1549], 00:34:12.787 | 99.999th=[ 1598] 00:34:12.787 bw ( KiB/s): min=67664, max=116328, per=98.13%, avg=91634.53, stdev=2862.12, samples=76 00:34:12.787 iops : min=16916, max=29082, avg=22908.63, stdev=715.53, samples=76 00:34:12.787 lat (usec) : 20=0.01%, 50=0.02%, 100=5.46%, 250=33.97%, 500=36.07% 00:34:12.787 lat (usec) : 750=15.92%, 1000=6.26% 00:34:12.787 lat (msec) : 2=2.30%, 4=0.01%, 20=0.01% 00:34:12.787 cpu : usr=99.56%, sys=0.01%, ctx=64, majf=0, minf=313 00:34:12.787 IO depths : 1=5.9%, 2=26.9%, 4=53.8%, 8=13.4%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:12.787 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:12.787 complete : 0=0.0%, 4=88.1%, 8=11.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:12.788 issued rwts: total=213277,228173,0,0 short=0,0,0,0 dropped=0,0,0,0 00:34:12.788 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:12.788 00:34:12.788 Run status group 0 (all jobs): 00:34:12.788 READ: bw=83.3MiB/s (87.3MB/s), 83.3MiB/s-83.3MiB/s (87.3MB/s-87.3MB/s), io=833MiB (874MB), run=10001-10001msec 00:34:12.788 WRITE: bw=91.2MiB/s (95.6MB/s), 91.2MiB/s-91.2MiB/s (95.6MB/s-95.6MB/s), io=891MiB (935MB), run=9774-9774msec 00:34:12.788 00:34:12.788 real 0m13.469s 00:34:12.788 user 0m45.952s 00:34:12.788 sys 0m0.461s 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:34:12.788 ************************************ 00:34:12.788 END TEST bdev_fio_rw_verify 00:34:12.788 ************************************ 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:12.788 
18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:12.788 
18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "8d5c809f-46e7-51cd-8329-866d01d0c902"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8d5c809f-46e7-51cd-8329-866d01d0c902",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "d29b4cbb-9586-505c-8045-48e6b07fc7e6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d29b4cbb-9586-505c-8045-48e6b07fc7e6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' 
' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "36b31988-15f8-5347-bba8-441219305332"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "36b31988-15f8-5347-bba8-441219305332",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": 
"test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "67a9dd8b-5ab4-5b8b-be1e-2d5148bb0a89"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "67a9dd8b-5ab4-5b8b-be1e-2d5148bb0a89",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:34:12.788 crypto_ram1 00:34:12.788 crypto_ram2 00:34:12.788 crypto_ram3 ]] 00:34:12.788 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "8d5c809f-46e7-51cd-8329-866d01d0c902"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8d5c809f-46e7-51cd-8329-866d01d0c902",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "d29b4cbb-9586-505c-8045-48e6b07fc7e6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d29b4cbb-9586-505c-8045-48e6b07fc7e6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "36b31988-15f8-5347-bba8-441219305332"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "36b31988-15f8-5347-bba8-441219305332",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "67a9dd8b-5ab4-5b8b-be1e-2d5148bb0a89"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "67a9dd8b-5ab4-5b8b-be1e-2d5148bb0a89",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 
00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:12.789 ************************************ 00:34:12.789 START TEST bdev_fio_trim 00:34:12.789 ************************************ 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in 
"${sanitizers[@]}" 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:12.789 18:37:54 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:12.789 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:12.789 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:12.789 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:12.789 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:12.789 fio-3.35 00:34:12.789 Starting 4 threads 00:34:24.977 00:34:24.977 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2663563: Fri Jul 12 18:38:07 2024 
00:34:24.977 write: IOPS=33.2k, BW=130MiB/s (136MB/s)(1296MiB/10001msec); 0 zone resets 00:34:24.977 slat (usec): min=18, max=1503, avg=71.07, stdev=36.35 00:34:24.977 clat (usec): min=44, max=1088, avg=252.00, stdev=134.83 00:34:24.977 lat (usec): min=73, max=1953, avg=323.08, stdev=150.98 00:34:24.977 clat percentiles (usec): 00:34:24.977 | 50.000th=[ 229], 99.000th=[ 635], 99.900th=[ 758], 99.990th=[ 848], 00:34:24.977 | 99.999th=[ 1074] 00:34:24.977 bw ( KiB/s): min=114688, max=158512, per=99.95%, avg=132628.55, stdev=4034.34, samples=77 00:34:24.977 iops : min=28672, max=39628, avg=33157.01, stdev=1008.58, samples=77 00:34:24.977 trim: IOPS=33.2k, BW=130MiB/s (136MB/s)(1296MiB/10001msec); 0 zone resets 00:34:24.977 slat (usec): min=6, max=429, avg=19.46, stdev= 7.17 00:34:24.977 clat (usec): min=73, max=1953, avg=323.29, stdev=151.00 00:34:24.977 lat (usec): min=80, max=1966, avg=342.74, stdev=152.35 00:34:24.977 clat percentiles (usec): 00:34:24.977 | 50.000th=[ 297], 99.000th=[ 750], 99.900th=[ 881], 99.990th=[ 1004], 00:34:24.977 | 99.999th=[ 1270] 00:34:24.977 bw ( KiB/s): min=114688, max=158512, per=99.94%, avg=132628.55, stdev=4034.34, samples=77 00:34:24.977 iops : min=28672, max=39628, avg=33157.01, stdev=1008.58, samples=77 00:34:24.977 lat (usec) : 50=0.06%, 100=5.90%, 250=40.50%, 500=43.83%, 750=9.17% 00:34:24.977 lat (usec) : 1000=0.54% 00:34:24.977 lat (msec) : 2=0.01% 00:34:24.977 cpu : usr=99.57%, sys=0.00%, ctx=111, majf=0, minf=114 00:34:24.977 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:24.977 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:24.977 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:24.977 issued rwts: total=0,331784,331786,0 short=0,0,0,0 dropped=0,0,0,0 00:34:24.977 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:24.977 00:34:24.977 Run status group 0 (all jobs): 00:34:24.977 WRITE: bw=130MiB/s (136MB/s), 
130MiB/s-130MiB/s (136MB/s-136MB/s), io=1296MiB (1359MB), run=10001-10001msec 00:34:24.977 TRIM: bw=130MiB/s (136MB/s), 130MiB/s-130MiB/s (136MB/s-136MB/s), io=1296MiB (1359MB), run=10001-10001msec 00:34:24.977 00:34:24.977 real 0m13.499s 00:34:24.977 user 0m45.740s 00:34:24.977 sys 0m0.511s 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:24.977 ************************************ 00:34:24.977 END TEST bdev_fio_trim 00:34:24.977 ************************************ 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:34:24.977 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:34:24.977 00:34:24.977 real 0m27.314s 00:34:24.977 user 1m31.880s 00:34:24.977 sys 0m1.151s 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:24.977 ************************************ 00:34:24.977 END TEST bdev_fio 00:34:24.977 ************************************ 00:34:24.977 18:38:08 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:24.977 18:38:08 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:24.977 18:38:08 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:24.977 18:38:08 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:24.977 18:38:08 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:24.977 18:38:08 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:24.977 ************************************ 00:34:24.977 START TEST bdev_verify 00:34:24.977 ************************************ 00:34:24.977 18:38:08 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:24.977 [2024-07-12 18:38:08.441588] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:34:24.977 [2024-07-12 18:38:08.441647] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2664984 ] 00:34:24.977 [2024-07-12 18:38:08.559496] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:24.977 [2024-07-12 18:38:08.661742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:24.977 [2024-07-12 18:38:08.661757] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:24.977 [2024-07-12 18:38:08.683704] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:24.977 [2024-07-12 18:38:08.691752] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:24.977 [2024-07-12 18:38:08.699764] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:25.233 [2024-07-12 18:38:08.806071] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:27.783 [2024-07-12 18:38:11.019301] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:27.783 [2024-07-12 18:38:11.019393] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:27.783 [2024-07-12 18:38:11.019409] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:27.783 [2024-07-12 18:38:11.027319] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:27.783 [2024-07-12 18:38:11.027340] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:27.783 [2024-07-12 18:38:11.027353] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:27.783 
[2024-07-12 18:38:11.035343] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:27.783 [2024-07-12 18:38:11.035362] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:27.783 [2024-07-12 18:38:11.035373] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:27.783 [2024-07-12 18:38:11.043365] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:27.783 [2024-07-12 18:38:11.043383] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:27.783 [2024-07-12 18:38:11.043395] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:27.783 Running I/O for 5 seconds... 00:34:33.081 00:34:33.081 Latency(us) 00:34:33.081 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:33.081 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:33.081 Verification LBA range: start 0x0 length 0x1000 00:34:33.081 crypto_ram : 5.08 486.30 1.90 0.00 0.00 261316.63 3191.32 196038.12 00:34:33.081 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:33.081 Verification LBA range: start 0x1000 length 0x1000 00:34:33.081 crypto_ram : 5.08 492.60 1.92 0.00 0.00 258214.86 4217.10 195126.32 00:34:33.081 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:33.081 Verification LBA range: start 0x0 length 0x1000 00:34:33.081 crypto_ram1 : 5.09 490.45 1.92 0.00 0.00 258441.64 6496.61 175066.60 00:34:33.081 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:33.081 Verification LBA range: start 0x1000 length 0x1000 00:34:33.081 crypto_ram1 : 5.09 496.74 1.94 0.00 0.00 255374.78 7522.39 174154.80 00:34:33.081 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:33.081 
Verification LBA range: start 0x0 length 0x1000 00:34:33.081 crypto_ram2 : 5.07 3827.01 14.95 0.00 0.00 33033.54 3219.81 29633.67 00:34:33.081 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:33.081 Verification LBA range: start 0x1000 length 0x1000 00:34:33.081 crypto_ram2 : 5.06 3848.25 15.03 0.00 0.00 32863.47 7693.36 29633.67 00:34:33.081 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:33.081 Verification LBA range: start 0x0 length 0x1000 00:34:33.081 crypto_ram3 : 5.07 3835.89 14.98 0.00 0.00 32880.32 1852.10 31001.38 00:34:33.081 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:33.081 Verification LBA range: start 0x1000 length 0x1000 00:34:33.081 crypto_ram3 : 5.07 3861.57 15.08 0.00 0.00 32656.51 3846.68 30545.47 00:34:33.081 =================================================================================================================== 00:34:33.081 Total : 17338.81 67.73 0.00 0.00 58511.35 1852.10 196038.12 00:34:33.081 00:34:33.081 real 0m8.255s 00:34:33.081 user 0m15.639s 00:34:33.081 sys 0m0.395s 00:34:33.081 18:38:16 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:33.081 18:38:16 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:34:33.081 ************************************ 00:34:33.081 END TEST bdev_verify 00:34:33.081 ************************************ 00:34:33.081 18:38:16 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:33.081 18:38:16 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:33.081 18:38:16 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:33.081 18:38:16 blockdev_crypto_qat -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:34:33.081 18:38:16 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:33.081 ************************************ 00:34:33.081 START TEST bdev_verify_big_io 00:34:33.081 ************************************ 00:34:33.081 18:38:16 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:33.081 [2024-07-12 18:38:16.761827] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:34:33.082 [2024-07-12 18:38:16.761885] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2666050 ] 00:34:33.339 [2024-07-12 18:38:16.892218] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:33.339 [2024-07-12 18:38:16.993886] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:33.339 [2024-07-12 18:38:16.993891] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:33.339 [2024-07-12 18:38:17.015285] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:33.339 [2024-07-12 18:38:17.023316] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:33.339 [2024-07-12 18:38:17.031344] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:33.596 [2024-07-12 18:38:17.144145] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:36.122 [2024-07-12 18:38:19.360970] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc" 00:34:36.122 [2024-07-12 18:38:19.361057] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:36.122 [2024-07-12 18:38:19.361072] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:36.122 [2024-07-12 18:38:19.368987] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:36.122 [2024-07-12 18:38:19.369007] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:36.122 [2024-07-12 18:38:19.369019] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:36.122 [2024-07-12 18:38:19.377006] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:36.122 [2024-07-12 18:38:19.377024] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:36.122 [2024-07-12 18:38:19.377035] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:36.122 [2024-07-12 18:38:19.385029] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:36.122 [2024-07-12 18:38:19.385047] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:36.122 [2024-07-12 18:38:19.385059] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:36.122 Running I/O for 5 seconds... 00:34:36.690 [2024-07-12 18:38:20.339058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.690 [2024-07-12 18:38:20.339518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.690 [2024-07-12 18:38:20.339885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... repeated accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! lines elided; the same message recurs continuously through the rest of this excerpt ...]
00:34:36.692 [2024-07-12 18:38:20.405892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.405939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.405983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.406403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.406420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.406435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.406450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.409478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.409523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.409565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.409606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.410073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.410117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.692 [2024-07-12 18:38:20.410159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.410200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.410629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.410650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.410664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.410678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.413622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.413667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.413709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.413752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.414228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.414273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.692 [2024-07-12 18:38:20.414314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.692 [2024-07-12 18:38:20.414355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.414713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.414730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.414745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.414760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.417672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.417718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.417759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.417800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.418260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.418318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.418359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.418423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.952 [2024-07-12 18:38:20.418795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.418814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.952 [2024-07-12 18:38:20.418828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.418843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.421965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.422012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.422054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.422101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.422579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.422633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.422685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.422727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.423104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.953 [2024-07-12 18:38:20.423121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.423136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.423151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.426186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.426231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.426272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.426322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.426764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.426826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.426877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.426940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.427311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.427328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.953 [2024-07-12 18:38:20.427343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.427358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.430552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.430614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.430666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.430706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.431149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.431194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.431236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.431278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.431668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.431685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.431703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.953 [2024-07-12 18:38:20.431717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.434549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.434594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.434647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.434689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.435255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.435312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.435353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.435393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.435838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.435855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.435870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.435886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.953 [2024-07-12 18:38:20.438723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.438773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.438814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.438854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.439310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.439354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.439396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.439437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.439857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.439875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.439890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.439906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.442580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.953 [2024-07-12 18:38:20.442637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.442680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.442720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.443230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.443279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.443321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.443362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.443777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.443795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.443811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.443826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.446774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.446820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.953 [2024-07-12 18:38:20.446862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.446903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.447380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.447426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.447468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.447509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.447960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.447977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.447992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.448007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.450609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.450654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.450695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.953 [2024-07-12 18:38:20.450736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.451194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.451243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.451296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.451340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.451805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.451822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.451837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.451857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.454754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.953 [2024-07-12 18:38:20.454800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.454841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.454883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.954 [2024-07-12 18:38:20.455351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.455394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.455436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.455477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.455911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.455932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.455949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.455968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.458772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.458818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.458859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.458899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.459372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.954 [2024-07-12 18:38:20.459417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.459457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.459499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.459917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.459938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.459953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.459967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.462763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.462809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.462850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.462892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.463360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:36.954 [2024-07-12 18:38:20.463405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:36.954 [2024-07-12 18:38:20.463457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.219 [2024-07-12 18:38:20.714043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (identical message repeated continuously between 18:38:20.463457 and 18:38:20.714043; duplicates elided)
00:34:37.219 [2024-07-12 18:38:20.715781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.219 [2024-07-12 18:38:20.716661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.219 [2024-07-12 18:38:20.717063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.219 [2024-07-12 18:38:20.717451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.219 [2024-07-12 18:38:20.718236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.219 [2024-07-12 18:38:20.718530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.219 [2024-07-12 18:38:20.718546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.718562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.718576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.721609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.723253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.724902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.725730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.220 [2024-07-12 18:38:20.726583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.726981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.727366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.729205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.729482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.729500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.729514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.729528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.732978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.734638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.736106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.736498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.737333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.220 [2024-07-12 18:38:20.737724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.738922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.740320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.740597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.740613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.740628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.740647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.743951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.745609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.746018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.746408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.747203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.747844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.220 [2024-07-12 18:38:20.749231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.750887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.751172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.751190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.751204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.751219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.754568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.755555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.755967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.756356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.757219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.759064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.760756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.220 [2024-07-12 18:38:20.762541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.762813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.762829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.762844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.762859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.766213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.766607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.766998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.767386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.768865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.770261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.771915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.773554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.220 [2024-07-12 18:38:20.773876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.773893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.773907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.773923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.776169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.776564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.776961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.777353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.779207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.780860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.782527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.783911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.784227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.220 [2024-07-12 18:38:20.784245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.784259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.784273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.786296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.786692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.787104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.787498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.220 [2024-07-12 18:38:20.789162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.790823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.792484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.793231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.793519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.793537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.221 [2024-07-12 18:38:20.793552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.793567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.795631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.796036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.796426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.797913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.799797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.800935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.802305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.803702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.803981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.803998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.804012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.221 [2024-07-12 18:38:20.804027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.806599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.807004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.807409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.807801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.808635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.809035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.809438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.809841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.810305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.810323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.810338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.810353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.221 [2024-07-12 18:38:20.812984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.813383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.813772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.813810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.814657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.815061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.815454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.815846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.816274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.816292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.816307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.816322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.818827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.221 [2024-07-12 18:38:20.819226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.819616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.820026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.820082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.820444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.820851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.821246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.821635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.822046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.822427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.822444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.822459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.822473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.221 [2024-07-12 18:38:20.824997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.825047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.825089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.825131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.825498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.825562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.825605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.825645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.825688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.826056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.826074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.826089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.826108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.221 [2024-07-12 18:38:20.828522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.828569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.828621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.828665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.829129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.829194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.829236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.829291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.221 [2024-07-12 18:38:20.829351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.222 [2024-07-12 18:38:20.829716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.222 [2024-07-12 18:38:20.829733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.222 [2024-07-12 18:38:20.829748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.222 [2024-07-12 18:38:20.829762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.222 [2024-07-12 18:38:20.832158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:37.225 [... previous *ERROR* line repeated 272 more times, 2024-07-12 18:38:20.832217 through 18:38:20.903305 ...]
00:34:37.225 [2024-07-12 18:38:20.905533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.225 [2024-07-12 18:38:20.905579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.225 [2024-07-12 18:38:20.905623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.225 [2024-07-12 18:38:20.905665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.225 [2024-07-12 18:38:20.906071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.906124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.906165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.906208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.906249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.906683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.906700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.906715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.906730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.226 [2024-07-12 18:38:20.908255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.908300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.908341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.908383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.908782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.908849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.908889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.908935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.908975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.909290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.909307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.909321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.909335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.226 [2024-07-12 18:38:20.911102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.911148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.911189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.911231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.911662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.911719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.911763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.911803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.911844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.912301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.912319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.912334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.912350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.226 [2024-07-12 18:38:20.914012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.914958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.226 [2024-07-12 18:38:20.916564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.916608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.916652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.916692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.917130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.917182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.917224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.917266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.917307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.917711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.917728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.917742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.917757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.226 [2024-07-12 18:38:20.919661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.919706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.919746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.919786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.920054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.920115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.920156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.920196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.920236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.920706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.920723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.920737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.920751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.226 [2024-07-12 18:38:20.922265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.922311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.226 [2024-07-12 18:38:20.922356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.922398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.922807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.922857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.922900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.922946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.922987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.923423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.923442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.923457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.923472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.227 [2024-07-12 18:38:20.925667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.925711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.927465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.927526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.927791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.927841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.927890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.927936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.927978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.928298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.928314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.928329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.928343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.227 [2024-07-12 18:38:20.929953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.929998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.930053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.930441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.930871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.930930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.930978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.931019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.931059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.931515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.931533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.931548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.931562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.227 [2024-07-12 18:38:20.934944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.935960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.937141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.938694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.938969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.227 [2024-07-12 18:38:20.940748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.941147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.941537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.941923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.942338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.942356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.942373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.942388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.488 [2024-07-12 18:38:20.945630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.946792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.948181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.949837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.950111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.951393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.951782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.952174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.952574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.953017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.953036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.953056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.953071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.488 [2024-07-12 18:38:20.955353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.956990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.958738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.960589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.960872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.961326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.961712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.962104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.962492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.962834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.962850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.962864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.962879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.488 [2024-07-12 18:38:20.965585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.966970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.968614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.970278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.970674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.971086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.971475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.971870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.972387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.972657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.972673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.972688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.488 [2024-07-12 18:38:20.972703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.488 [2024-07-12 18:38:20.975782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:37.488 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously from 2024-07-12 18:38:20.975782 through 2024-07-12 18:38:21.217172; identical entries omitted ...]
00:34:37.755 [2024-07-12 18:38:21.220241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.221842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.223254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.223643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.224083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.224482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.224870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.226053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.227734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.228014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.228031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.228045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.228062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.755 [2024-07-12 18:38:21.230778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.231180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.231569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.231962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.232414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.232815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.233216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.233606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.233999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.234429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.234447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.234462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.234480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.755 [2024-07-12 18:38:21.237130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.237527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.237919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.238318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.238723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.239131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.239520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.239907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.240318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.240748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.240766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.240781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.240795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.755 [2024-07-12 18:38:21.243850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.244258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.244650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.245043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.755 [2024-07-12 18:38:21.245502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.245906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.246310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.246704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.247098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.247481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.247498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.247512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.247527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.756 [2024-07-12 18:38:21.250163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.250552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.250940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.251343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.251735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.252144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.252530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.252943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.253332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.253704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.253721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.253736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.253750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.756 [2024-07-12 18:38:21.256496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.256908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.256968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.257355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.257807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.258211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.258599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.258992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.259385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.259798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.259815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.259830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.259845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.756 [2024-07-12 18:38:21.262514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.262908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.263302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.263349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.263845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.264252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.264648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.265045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.265437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.265838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.265855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.265869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.265884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.756 [2024-07-12 18:38:21.268211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.268256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.268298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.268339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.268759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.268817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.268858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.268900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.268950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.269443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.269459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.269474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.269492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.756 [2024-07-12 18:38:21.271809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.271853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.271897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.271942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.272345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.272396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.272437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.272477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.272518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.272946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.272963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.272978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.272993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.756 [2024-07-12 18:38:21.275261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.275307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.275362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.275403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.275860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.275913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.275960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.276001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.276043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.276471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.276491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.276505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.276520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.756 [2024-07-12 18:38:21.278900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.278952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.278995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.279038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.279474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.279526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.279568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.279621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.279662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.280156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.280173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.280188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.756 [2024-07-12 18:38:21.280203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.757 [2024-07-12 18:38:21.282553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.282598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.282639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.282680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.283047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.283113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.283155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.283196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.283237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.283587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.283604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.283618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.283633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.757 [2024-07-12 18:38:21.286246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.286307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.286362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.286421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.286841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.286908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.286969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.287021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.287062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.287451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.287467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.287482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.757 [2024-07-12 18:38:21.287497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.757 [2024-07-12 18:38:21.289793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:37.760 [identical *ERROR* message repeated continuously from 18:38:21.289793 through 18:38:21.356030; duplicate lines omitted]
00:34:37.760 [2024-07-12 18:38:21.357899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.357950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.357995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.358037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.358471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.358523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.358564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.358605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.358647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.359076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.359093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.359107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.359121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.760 [2024-07-12 18:38:21.360630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.360681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.360722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.360764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.361106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.361165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.361206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.361246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.361290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.361618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.361634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.361649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.361664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.760 [2024-07-12 18:38:21.363503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.363549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.363590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.363631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.364018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.364079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.364120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.364160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.364201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.364645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.364662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.364678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.364693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.760 [2024-07-12 18:38:21.366304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.366349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.367498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.367544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.367812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.367868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.367909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.367962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.368005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.368275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.368291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.368306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.368324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.760 [2024-07-12 18:38:21.370300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.370347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.370389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.370777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.371217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.371270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.371312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.371352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.371394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.760 [2024-07-12 18:38:21.371664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.371681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.371695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.371709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.761 [2024-07-12 18:38:21.374879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.376550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.378193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.379655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.380032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.380436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.380827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.381224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.382517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.382826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.382843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.382857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.382872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.761 [2024-07-12 18:38:21.385823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.387488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.389145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.389717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.390200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.390599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.390991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.391702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.393088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.393360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.393376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.393391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.393405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.761 [2024-07-12 18:38:21.396655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.398322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.399304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.399708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.400134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.400539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.400933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.402671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.404252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.404523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.404540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.404554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.404569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.761 [2024-07-12 18:38:21.407946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.409715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.410109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.410497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.410910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.411311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.412502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.413904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.415541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.415816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.415832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.415847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.415861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.761 [2024-07-12 18:38:21.419160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.419626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.420019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.420410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.420873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.421479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.422868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.424520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.426172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.426445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.426461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.426477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.426492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.761 [2024-07-12 18:38:21.429374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.429767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.430161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.430564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.431026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.432731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.434273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.435922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.437652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.438133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.438150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.438164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.438178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.761 [2024-07-12 18:38:21.440189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.440587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.440981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.441371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.441647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.443032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.444700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.446356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.447101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.447374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.761 [2024-07-12 18:38:21.447390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.447405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.447419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.762 [2024-07-12 18:38:21.449408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.449804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.450199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.451349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.451689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.453350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.454998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.456272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.457843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.458152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.458169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.458183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:37.762 [2024-07-12 18:38:21.458198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:37.762 [2024-07-12 18:38:21.460473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... same error repeated from 18:38:21.460877 through 18:38:21.673830 ...]
00:34:38.027 [2024-07-12 18:38:21.676496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.676893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.677292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.677689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.678181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.678582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.678976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.679371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.679763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.680283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.680301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.680316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.680331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.027 [2024-07-12 18:38:21.683151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.683553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.683599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.683990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.684393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.684790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.685186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.685578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.685980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.686410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.686427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.686442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.686458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.027 [2024-07-12 18:38:21.689048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.689451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.689839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.689885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.690350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.690752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.691151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.691556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.691955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.692429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.692446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.692467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.692482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.027 [2024-07-12 18:38:21.694800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.694848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.694890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.694937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.695343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.695395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.695436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.695476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.695518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.695964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.695981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.695997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.696012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.027 [2024-07-12 18:38:21.698245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.698290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.698332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.698378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.698851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.698911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.698958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.699000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.699044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.699452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.699469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.699484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.699498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.027 [2024-07-12 18:38:21.701867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.701912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.701957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.702004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.702448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.702499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.702544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.702585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.702627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.703027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.703044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.703059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.703074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.027 [2024-07-12 18:38:21.705389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.705435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.705476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.705518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.705957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.706020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.706062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.706116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.706175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.706578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.706595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.027 [2024-07-12 18:38:21.706609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.706624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.028 [2024-07-12 18:38:21.709020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.709065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.709106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.709147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.709528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.709593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.709635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.709691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.709745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.710114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.710131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.710146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.710160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.028 [2024-07-12 18:38:21.712827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.712884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.712944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.713001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.713482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.713544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.713599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.713652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.713694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.714083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.714100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.714115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.714129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.028 [2024-07-12 18:38:21.716496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.716542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.716582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.716636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.717043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.717106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.717162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.717215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.717257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.717719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.717737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.717751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.717771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.028 [2024-07-12 18:38:21.720062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.720132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.720192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.720244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.720696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.720745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.720787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.720828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.720868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.721302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.721320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.721335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.721351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.028 [2024-07-12 18:38:21.723607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.723652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.723705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.723756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.724212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.724269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.724311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.724353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.724394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.724782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.724799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.724813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.724827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.028 [2024-07-12 18:38:21.726920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.726970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.028 [2024-07-12 18:38:21.727981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.028 [2024-07-12 18:38:21.730314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.292 [2024-07-12 18:38:21.791491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (message repeated 273 times between 18:38:21.730314 and 18:38:21.791491)
00:34:38.292 [2024-07-12 18:38:21.793304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.793361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.793403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.793794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.794280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.794344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.794386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.794428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.794470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.794901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.794916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.794936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.794951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.293 [2024-07-12 18:38:21.797402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.798791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.800453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.802117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.802467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.802872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.803268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.803660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.804055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.804326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.804348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.804362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.804377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.293 [2024-07-12 18:38:21.807662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.809455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.811151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.812752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.813123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.813527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.813918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.814310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.815392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.815693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.815709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.815724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.815738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.293 [2024-07-12 18:38:21.818764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.820409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.822073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.822649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.823123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.823523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.823919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.824362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.825878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.826153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.826169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.826184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.826198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.293 [2024-07-12 18:38:21.829511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.831165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.832404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.832795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.833250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.833650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.834045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.835506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.836892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.837166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.837183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.837197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.837212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.293 [2024-07-12 18:38:21.840532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.842292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.842691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.843088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.843463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.843866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.844649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.846033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.847673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.847949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.847966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.847980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.847995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.293 [2024-07-12 18:38:21.851336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.852237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.852635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.853031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.853471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.853873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.855674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.857400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.859234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.859503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.859519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.859533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.859548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.293 [2024-07-12 18:38:21.862692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.863095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.863488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.863875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.864329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.865625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.867016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.868667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.870324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.293 [2024-07-12 18:38:21.870670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.870687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.870701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.870717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.294 [2024-07-12 18:38:21.872696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.873101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.873492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.873881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.874256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.875642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.877307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.878963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.880247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.880532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.880548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.880567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.880582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.294 [2024-07-12 18:38:21.882678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.883080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.883477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.884077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.884349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.885996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.887745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.889589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.890660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.890972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.890989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.891004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.891018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.294 [2024-07-12 18:38:21.893180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.893577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.893978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.895705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.896015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.897684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.899336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.900114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.901575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.901842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.901859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.901873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.901887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.294 [2024-07-12 18:38:21.904251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.904649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.905816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.907211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.907482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.909168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.910388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.912061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.913583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.913853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.913870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.913885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.913899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.294 [2024-07-12 18:38:21.916385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.917103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.918500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.920159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.920430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.922119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.923337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.924731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.926397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.926668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.926684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.926698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.294 [2024-07-12 18:38:21.926712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.294 [2024-07-12 18:38:21.929348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.559 [2024-07-12 18:38:22.123612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.123659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.123699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.123740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.124187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.124238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.124282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.124324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.124365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.124704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.124721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.124736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.124751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.559 [2024-07-12 18:38:22.126949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.126997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.127054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.127099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.127367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.127416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.127470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.127522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.127563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.128026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.128043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.128058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.128073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.559 [2024-07-12 18:38:22.130136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.130182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.130222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.130262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.130699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.130749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.130791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.130836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.130877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.131153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.131170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.131184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.131198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.559 [2024-07-12 18:38:22.133555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.133627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.133670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.133711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.133984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.134063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.134111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.134152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.134193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.134651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.134669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.559 [2024-07-12 18:38:22.134684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.134700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.560 [2024-07-12 18:38:22.136805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.136854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.136896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.136946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.137334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.137394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.137437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.137477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.137530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.137797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.137814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.137828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.137842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.560 [2024-07-12 18:38:22.140267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.140314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.140355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.140396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.140802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.140886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.140950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.140994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.141034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.141302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.141318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.141333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.141348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.560 [2024-07-12 18:38:22.143180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.143227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.143271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.143315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.143757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.143806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.143863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.143907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.143956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.144225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.144242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.144257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.144271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.560 [2024-07-12 18:38:22.146712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.146758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.146798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.146838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.147274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.147331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.147373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.147414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.147455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.147860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.147880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.147895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.147910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.560 [2024-07-12 18:38:22.150248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.150295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.150336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.150377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.150767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.150829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.150870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.150910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.150958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.151380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.151397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.151412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.151428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.560 [2024-07-12 18:38:22.153073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.153118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.153159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.153198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.153593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.153657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.153698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.153739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.153778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.154106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.154123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.154138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.154153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.560 [2024-07-12 18:38:22.155952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.156005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.156050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.156091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.156521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.156572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.156615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.156667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.156718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.157201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.157220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.157236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.157252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.560 [2024-07-12 18:38:22.158992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.159040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.159080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.159120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.159386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.159447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.560 [2024-07-12 18:38:22.159490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.159531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.159572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.159842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.159858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.159872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.159887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.561 [2024-07-12 18:38:22.161568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.161614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.161655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.161695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.162118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.162169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.162215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.162261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.162304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.162666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.162682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.162697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.561 [2024-07-12 18:38:22.162712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.561 [2024-07-12 18:38:22.164647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.824 [2024-07-12 18:38:22.325119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.325133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.328455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.328854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.329250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.329639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.330075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.331462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.332857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.334513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.336191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.336671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.824 [2024-07-12 18:38:22.336687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.825 [2024-07-12 18:38:22.336701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.336716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.338648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.339047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.339435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.339823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.340117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.341508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.343169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.344827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.345612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.345883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.345899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.825 [2024-07-12 18:38:22.345919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.345939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.348043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.348435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.348824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.350170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.350479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.352092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.353723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.354891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.356600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.356944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.356961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.825 [2024-07-12 18:38:22.356975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.356989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.359350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.359740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.360701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.362102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.362371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.364050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.365508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.366910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.368855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.369137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.369154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.825 [2024-07-12 18:38:22.369169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.369183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.371877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.373448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.375178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.376832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.377104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.377964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.379350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.381005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.382666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.382971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.382988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.825 [2024-07-12 18:38:22.383003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.383018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.387323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.389155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.390868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.392484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.392863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.394263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.395919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.397575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.398747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.399171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.399188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.825 [2024-07-12 18:38:22.399203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.399218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.403011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.404753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.406598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.407660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.407972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.409642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.411286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.412615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.413009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.413441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.413459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.825 [2024-07-12 18:38:22.413474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.413489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.417362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.419096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.420047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.421448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.421716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.423391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.424838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.425232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.425620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.426065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.426083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.825 [2024-07-12 18:38:22.426097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.426113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.428784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.430482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.432321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.434142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.434539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.825 [2024-07-12 18:38:22.436135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.436524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.437198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.438465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.438901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.438920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.826 [2024-07-12 18:38:22.438945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.438961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.442108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.443442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.444832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.446491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.446760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.447878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.448315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.448703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.449093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.449530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.449547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.826 [2024-07-12 18:38:22.449562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.449576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.451498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.452994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.454681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.456351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.456621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.457042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.457430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.457818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.458210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.458582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.458599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.826 [2024-07-12 18:38:22.458613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.458629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.461286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.461683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.462082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.462476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.462911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.463315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.463706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.464106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.464511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.464963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.464981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.826 [2024-07-12 18:38:22.464997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.465012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.467691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.468092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.468481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.468869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.469251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.469655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.470050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.470438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.470829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.471255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:38.826 [2024-07-12 18:38:22.471273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:38.826 [2024-07-12 18:38:22.471288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.471303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.473981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.474374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.474769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.475168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.475643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.476048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.476438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.476827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.477234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.477619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.477636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.477651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.477666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.480473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.480870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.481265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.481652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.482093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.482495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.482886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.483281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.483670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.484117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.484135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.484149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.484164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.486793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.487204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.487596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.487993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.488390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.488808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.489200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.489590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.489984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.490366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.490383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.490397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.490416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.493278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.826 [2024-07-12 18:38:22.493680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.494076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.494465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.494884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.495284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.495673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.496070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.496466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.496897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.496915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.496936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.496951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.499660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.500056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.500447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.500836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.501288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.501693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.502087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.502474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.502864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.503281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.503299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.503314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.503328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.506030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.506426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.506818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.507211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.507651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.508053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.508442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.508829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.509226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.509651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.509668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.509683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.509698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.512389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.512801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.513193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.513580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.514002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.514404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.514798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.515196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.515585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.516058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.516076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.516091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.516106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.518826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.519230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.519621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.520015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.520460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.520859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.521251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.521639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.522042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.522407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.522424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.522438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.522453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.525469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.525867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.526266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.526654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.527119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.527520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.527938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.528338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.528727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.529122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.529139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.529154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.529170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.531744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.531793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.532184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.532573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.532963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.533369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.533760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.534152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.534538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.534988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.535006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.535022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.535037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.539567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.539971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.540381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.540435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.540888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.541291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.541679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.542070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.542461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.542835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.827 [2024-07-12 18:38:22.542852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.828 [2024-07-12 18:38:22.542867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.828 [2024-07-12 18:38:22.542882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.828 [2024-07-12 18:38:22.546294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.828 [2024-07-12 18:38:22.546346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.828 [2024-07-12 18:38:22.547598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:38.828 [2024-07-12 18:38:22.547643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.547914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.549423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.549469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.550422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.550470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.550741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.550758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.550772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.550786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.552778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.552828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.553221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.553264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.553700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.554713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.554759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.555974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.089 [2024-07-12 18:38:22.556019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.556290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.556306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.556321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.556335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.557947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.557991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.558858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.561197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.561244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.561286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.561335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.561605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.561656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.561704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.561745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.561790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.562067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.562083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.562098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.562112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.563767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.563810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.563851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.563891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.564160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.564218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.564258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.564299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.564338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.564604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.564621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.564635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.564649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.567998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.568012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.568026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.569636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.569686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.569732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.569772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.570044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.570110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.570154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.570194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.570234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.570503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.570519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.570533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.570548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.572799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.572844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.572885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.572931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.573356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.573411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.573451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.573491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.573532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.573828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.573845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.573859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.573873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.575494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.575538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.575582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.575626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.575913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.575978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.576019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.576059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.576103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.576368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.576384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.576399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.576413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.578559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.578603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.578643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.090 [2024-07-12 18:38:22.578687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.091 [2024-07-12 18:38:22.579131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.091 [2024-07-12 18:38:22.579186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.091 [2024-07-12 18:38:22.579229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.091 [2024-07-12 18:38:22.579271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.091 [2024-07-12 18:38:22.579314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.091 [2024-07-12 18:38:22.579581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.091 [2024-07-12 18:38:22.579597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.091 [2024-07-12 18:38:22.579611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.579626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.581249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.581293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.581333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.581372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.581720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.581779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.581820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.581860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.581900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.582167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.582188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.091 [2024-07-12 18:38:22.582202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.582217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.584472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.584518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.584558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.584599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.585063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.585117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.585158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.585200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.585240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.585555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.585571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.091 [2024-07-12 18:38:22.585586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.585600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.587212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.587257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.587308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.587349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.587618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.587676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.587718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.587763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.587803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.588071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.588088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.091 [2024-07-12 18:38:22.588102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.588117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.590099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.590149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.590189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.590231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.590685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.590743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.590786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.590827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.590868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.591293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.591309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.091 [2024-07-12 18:38:22.591323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.591338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.592872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.592924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.592970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.593012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.593329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.593383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.593424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.593464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.593505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.593811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.593827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.091 [2024-07-12 18:38:22.593841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.593856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.595715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.595760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.595803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.595844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.596208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.596261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.596306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.596347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.596387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.596827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.596844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.091 [2024-07-12 18:38:22.596860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.596877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.598440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.598484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.598528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.598568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.598999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.599060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.599101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.599141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.599181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.599517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.599533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.091 [2024-07-12 18:38:22.599548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.091 [2024-07-12 18:38:22.599562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.601284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.601328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.601371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.601411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.601836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.601891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.601937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.601993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.602036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.602498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.602520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.092 [2024-07-12 18:38:22.602535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.602551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.604249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.604298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.604338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.604378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.604641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.604698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.604739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.604781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.604821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.605090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.605106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.092 [2024-07-12 18:38:22.605121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.605135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.606763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.606808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.606848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.606888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.607349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.607402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.607443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.607487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.607528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.607893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.607910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.092 [2024-07-12 18:38:22.607924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.607943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.609831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.609875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.609923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.609967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.610234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.610291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.610332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.610372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.610418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.610781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.610797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.092 [2024-07-12 18:38:22.610812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.610826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.612354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.612399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.612440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.612481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.612971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.613030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.613072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.613113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.613153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.613596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.613612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.092 [2024-07-12 18:38:22.613626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.613642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.615711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.615758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.615799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.615838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.616109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.616167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.616211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.616251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.616291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.616608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.092 [2024-07-12 18:38:22.616624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.092 [2024-07-12 18:38:22.616639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.355 [2024-07-12 18:38:22.838058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.355 [2024-07-12 18:38:22.838451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.838841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.839239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.840026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.840416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.840803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.841197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.841559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.841576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.841591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.841606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.844254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.844654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.355 [2024-07-12 18:38:22.845063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.845448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.846316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.846718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.847116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.847510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.847940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.847959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.847974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.847989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.850683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.851080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.851469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.355 [2024-07-12 18:38:22.851857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.852673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.853074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.853462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.853847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.854277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.854294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.854310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.854325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.856975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.857369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.857763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.858156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.355 [2024-07-12 18:38:22.858982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.859372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.859761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.860155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.860633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.860650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.860669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.860685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.863318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.863713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.864106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.864494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.865347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.355 [2024-07-12 18:38:22.865739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.866133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.866521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.866990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.867008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.867026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.867042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.869669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.870071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.870463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.870857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.871621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.872016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.355 [2024-07-12 18:38:22.872404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.872796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.873164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.873181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.873196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.873211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.875847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.876248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.876640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.877032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.877852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.878247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.878638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.355 [2024-07-12 18:38:22.879033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.879482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.879500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.879515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.879530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.882182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.882573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.882979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.883369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.355 [2024-07-12 18:38:22.884147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.884553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.884946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.885333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.356 [2024-07-12 18:38:22.885761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.885777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.885792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.885807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.888463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.888859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.889265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.889662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.890484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.890875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.891273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.892853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.893252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.356 [2024-07-12 18:38:22.893269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.893283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.893298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.895852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.896258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.896655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.896703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.897542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.897935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.898324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.898723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.899080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.899098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.356 [2024-07-12 18:38:22.899112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.899127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.902106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.902621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.904080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.904468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.906507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.906897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.906947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.907908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.908228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.908245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.908259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.356 [2024-07-12 18:38:22.908274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.909876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.911263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.912931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.912976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.914324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.914382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.914769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.915160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.915638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.915655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.915671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.915686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.356 [2024-07-12 18:38:22.919017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.919938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.919987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.921384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.921700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.923365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.924506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.924556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.924831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.924847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.924861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.924875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.927429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.356 [2024-07-12 18:38:22.927492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.929137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.930957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.932870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.932917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.933776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.933821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.934134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.934150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.934165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.934179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.935899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.356 [2024-07-12 18:38:22.936296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.356 [2024-07-12 18:38:22.936339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:39.358 [identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev_task_alloc_resources repeated from 2024-07-12 18:38:22.936339 through 18:38:23.009872; duplicates omitted]
00:34:39.358 [2024-07-12 18:38:23.009886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.011500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.011544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.011590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.011638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.011943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.011995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.012036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.012077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.012402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.012418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.012433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.012447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.358 [2024-07-12 18:38:23.014487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.014535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.014577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.014621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.015097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.015142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.358 [2024-07-12 18:38:23.015187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.015228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.015491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.015507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.015522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.015537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.017145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.359 [2024-07-12 18:38:23.017189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.017229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.017268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.017658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.017701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.017740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.017781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.018051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.018067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.018082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.018096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.020155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.020204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.359 [2024-07-12 18:38:23.020248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.020288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.020794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.020837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.020879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.020919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.021298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.021314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.021329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.021343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.022859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.022904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.024355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.359 [2024-07-12 18:38:23.024399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.024750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.024793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.024833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.024873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.025142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.025159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.025173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.025187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.027031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.027079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.027121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.028037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.359 [2024-07-12 18:38:23.028363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.028407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.028796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.028839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.029158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.029174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.029188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.029203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.030781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.030826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.032618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.032669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.032984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.359 [2024-07-12 18:38:23.034647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.034692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.034746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.035022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.035039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.035053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.035068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.037494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.038654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.038700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.038741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.040722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.040778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.359 [2024-07-12 18:38:23.040819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.042471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.042848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.042865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.042880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.042894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.045047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.045096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.045137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.046210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.046669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.047550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.047596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.359 [2024-07-12 18:38:23.048501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.048932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.048955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.048970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.048986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.053641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.055133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.055178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.055220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.055534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.057220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.057272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.057314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.359 [2024-07-12 18:38:23.057726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.057743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.057757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.057772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.061454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.061504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.063150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.063194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.064832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.065233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.066930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.068468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.068513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.359 [2024-07-12 18:38:23.070316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.070588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.070605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.070619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.070633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.074965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.076315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.359 [2024-07-12 18:38:23.077813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.079336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.079607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.079661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.080535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.080580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.620 [2024-07-12 18:38:23.081960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.082231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.082248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.082262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.082276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.084553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.084951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.086647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.088038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.088308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.089976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.090904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.092640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.620 [2024-07-12 18:38:23.094396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.094668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.094684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.094698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.094713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.099020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.099873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.101260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.102916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.103189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.104784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.106098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.620 [2024-07-12 18:38:23.107491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.620 [2024-07-12 18:38:23.109154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[previous message repeated verbatim through 18:38:23.325116 (log time 00:34:39.623); duplicate occurrences omitted]
00:34:39.623 [2024-07-12 18:38:23.325515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.325980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.325998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.326015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.326031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.329425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.329823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.331464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.331854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.332280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.333967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.334368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.336001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.623 [2024-07-12 18:38:23.336398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.336826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.336844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.336863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.336878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.339736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.339788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.340890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.341293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.341640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.623 [2024-07-12 18:38:23.343008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.884 [2024-07-12 18:38:23.343396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.884 [2024-07-12 18:38:23.343788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.884 [2024-07-12 18:38:23.344197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.884 [2024-07-12 18:38:23.344552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.884 [2024-07-12 18:38:23.344569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.884 [2024-07-12 18:38:23.344584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.344598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.349589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.350619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.350669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.351431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.351854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.352729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.353820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.353867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.885 [2024-07-12 18:38:23.354264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.354556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.354573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.354589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.354603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.357759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.357814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.359461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.361252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.361525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.362179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.362228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.363510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.885 [2024-07-12 18:38:23.363898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.364269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.364285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.364300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.364314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.368659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.369852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.371581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.371635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.371909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.371974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.373606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.375243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.885 [2024-07-12 18:38:23.375291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.375701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.375718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.375732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.375746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.378056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.378881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.378939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.380319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.380587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.382273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.382320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.383511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.885 [2024-07-12 18:38:23.383556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.383824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.383840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.383855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.383869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.387732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.388305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.389687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.389734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.390201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.390999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.391046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.392429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.885 [2024-07-12 18:38:23.394092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.394368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.394384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.394399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.394413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.397724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.397776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.398451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.398500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.398767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.398826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.398868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.399266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.885 [2024-07-12 18:38:23.399311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.399682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.399698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.399712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.399727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.404080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.404130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.404171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.404212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.404553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.406136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.406183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.407774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.885 [2024-07-12 18:38:23.407823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.408099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.408115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.408129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.408144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.410049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.410097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.410139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.410182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.410493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.410547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.410588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.885 [2024-07-12 18:38:23.410628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.886 [2024-07-12 18:38:23.410668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.411063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.411080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.411095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.411110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.414820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.414871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.414911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.414959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.415296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.415355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.415396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.415436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.886 [2024-07-12 18:38:23.415476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.415741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.415757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.415772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.415786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.417508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.417557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.417599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.417640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.418090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.418148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.418194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.418235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.886 [2024-07-12 18:38:23.418275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.418544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.418563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.418578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.418592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.422298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.422351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.422393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.422434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.422700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.422754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.422795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [2024-07-12 18:38:23.422844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.886 [2024-07-12 18:38:23.422886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.886 [... identical *ERROR* message from accel_dpdk_cryptodev.c:468 repeated continuously for timestamps 18:38:23.423162 through 18:38:23.502861; repeated lines omitted ...]
00:34:39.889 [2024-07-12 18:38:23.502901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.503239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.503256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.503271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.503286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.508920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.508982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.510531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.510578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.510847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.510911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.510964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.512624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.889 [2024-07-12 18:38:23.512671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.512951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.512969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.512983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.513002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.514529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.515512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.515571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.515615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.515885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.515948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.516343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.516386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.889 [2024-07-12 18:38:23.516431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.516830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.516846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.516860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.516875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.522984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.523040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.523083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.524742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.525077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.526757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.526805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.526845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.889 [2024-07-12 18:38:23.528503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.528880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.528899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.528913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.528935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.534197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.534247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.535629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.535680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.535958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.536017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.537656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.537702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.889 [2024-07-12 18:38:23.538596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.538872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.538888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.538902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.538917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.543335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.543390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.543431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.544568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.545000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.545058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.546118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.546163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.889 [2024-07-12 18:38:23.546203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.546541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.889 [2024-07-12 18:38:23.546557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.546572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.546586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.551603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.553297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.553351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.554292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.554573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.554984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.555809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.555855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.890 [2024-07-12 18:38:23.556793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.557235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.557253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.557270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.557285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.563389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.565065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.566730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.567441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.567714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.567775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.568171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.568214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.890 [2024-07-12 18:38:23.569155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.569434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.569452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.569467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.569482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.574431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.575828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.577497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.579155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.579516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.581228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.581627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.582032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.890 [2024-07-12 18:38:23.583596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.584062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.584080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.584095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.584117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.589587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.591240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.592890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.593867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.594150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.594867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.595265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.596879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:39.890 [2024-07-12 18:38:23.597282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.597726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.597744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.597759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.597774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.604426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.606119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:39.890 [2024-07-12 18:38:23.607833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.608648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.608924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.609337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.609861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.611301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.150 [2024-07-12 18:38:23.611688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.612084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.612102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.612116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.612131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.150 [2024-07-12 18:38:23.618090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.619622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.620852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.622327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.622718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.623131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.624509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.625099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.151 [2024-07-12 18:38:23.625488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.625759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.625776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.625791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.625805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.631424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.633101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.634202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.635074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.635506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.636533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.637467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.637855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.151 [2024-07-12 18:38:23.639449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.639771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.639787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.639801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.639816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.646323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.646932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.648283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.648672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.649039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.650519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.650909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.151 [2024-07-12 18:38:23.651518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.151 [2024-07-12 18:38:23.653164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
(identical *ERROR* message repeated continuously from 18:38:23.653 to 18:38:23.860; duplicate entries elided)
00:34:40.154 [2024-07-12 18:38:23.860749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:40.154 [2024-07-12 18:38:23.862408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.862682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.862701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.862716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.862731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.867502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.869141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.869542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.869586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.870030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.870092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.871758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.872159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.154 [2024-07-12 18:38:23.872207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.872649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.872666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.872682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.154 [2024-07-12 18:38:23.872697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.878948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.880643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.880697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.882378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.882761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.884344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.884389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.884775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.415 [2024-07-12 18:38:23.884818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.885186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.885203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.885218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.885232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.889484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.890640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.892404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.892458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.892729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.894399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.894444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.896136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.415 [2024-07-12 18:38:23.896905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.897181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.897199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.897213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.897228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.901420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.901475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.903128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.903172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.903497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.903555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.903598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.905407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.415 [2024-07-12 18:38:23.905458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.905728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.905745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.905759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.905773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.910826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.910874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.910915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.910960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.911330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.911731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.911776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.913296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.415 [2024-07-12 18:38:23.913341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.415 [2024-07-12 18:38:23.913684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.913700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.913714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.913729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.918499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.918558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.918598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.918638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.918909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.918971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.919013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.919055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.416 [2024-07-12 18:38:23.919096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.919369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.919385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.919400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.919414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.922150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.922205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.922248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.922292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.922559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.922616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.922665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.922706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.416 [2024-07-12 18:38:23.922745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.923018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.923034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.923053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.923067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.927646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.927702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.927742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.927783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.928060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.928118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.928159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.928200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.416 [2024-07-12 18:38:23.928241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.928665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.928683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.928698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.928713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.933132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.933182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.933222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.933263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.933533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.933590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.933631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.933672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.416 [2024-07-12 18:38:23.933717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.934076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.934093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.934107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.934122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.938512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.938562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.938608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.938654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.939086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.939137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.939185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.939230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.416 [2024-07-12 18:38:23.939272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.939544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.939561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.939575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.939590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.943343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.943394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.943436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.943478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.943745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.943804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.943844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.943885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.416 [2024-07-12 18:38:23.943938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.944209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.944225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.944239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.944254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.949404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.949452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.949493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.949533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.949884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.949947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.949988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.950035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.416 [2024-07-12 18:38:23.950080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.950516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.950533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.950551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.950567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.955627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.955681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.955722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.955762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.416 [2024-07-12 18:38:23.956037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.417 [2024-07-12 18:38:23.956094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.417 [2024-07-12 18:38:23.956135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.417 [2024-07-12 18:38:23.956175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.417 [2024-07-12 18:38:23.956215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.417 [... identical "Failed to get src_mbufs!" error repeated continuously from 18:38:23.956215 through 18:38:24.080521 ...] 
00:34:40.419 [2024-07-12 18:38:24.080521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.419 [2024-07-12 18:38:24.082172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.419 [2024-07-12 18:38:24.082537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.419 [2024-07-12 18:38:24.082554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.419 [2024-07-12 18:38:24.082568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.419 [2024-07-12 18:38:24.082582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.088191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.088244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.088284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.088672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.088974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.089029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.090405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.090451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.420 [2024-07-12 18:38:24.090491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.090758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.090775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.090789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.090803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.095846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.096259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.096304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.096687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.097112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.097509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.098807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.098853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.420 [2024-07-12 18:38:24.100242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.100512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.100528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.100543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.100557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.106078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.106474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.106862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.107253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.107605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.107662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.109040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.109086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.420 [2024-07-12 18:38:24.110737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.111014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.111030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.111045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.111059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.115696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.116100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.116488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.117439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.117764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.119442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.121096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.122284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.420 [2024-07-12 18:38:24.123667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.123941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.123957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.123972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.123986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.128987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.130651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.132301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.133033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.133307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.134924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.136501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.420 [2024-07-12 18:38:24.138027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.681 [2024-07-12 18:38:24.138621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.138890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.138906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.138921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.138941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.143286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.145065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.146722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.147799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.148110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.149709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.151347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.152825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.681 [2024-07-12 18:38:24.153223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.153661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.153683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.153698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.153713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.158457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.160215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.161816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.163517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.163787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.164454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.164848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.165240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.681 [2024-07-12 18:38:24.165629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.166013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.166030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.166045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.166059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.172150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.173778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.175006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.175396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.175830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.176234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.176622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.177015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.681 [2024-07-12 18:38:24.177412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.177868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.177885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.177904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.177919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.181555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.181956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.182354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.182748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.183201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.183600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.183992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.184381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.681 [2024-07-12 18:38:24.184772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.185152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.185170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.185185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.185199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.188619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.189021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.189411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.189800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.190286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.190689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.191085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.191471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.681 [2024-07-12 18:38:24.191855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.192256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.192273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.192288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.192303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.195820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.196224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.196615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.197006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.197434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.197833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.198233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.198620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.681 [2024-07-12 18:38:24.199008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.199410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.199426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.199441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.199456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.202882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.203294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.203706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.204103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.204507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.204906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.205306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.205703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.681 [2024-07-12 18:38:24.206108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.206555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.206573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.206591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.681 [2024-07-12 18:38:24.206606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.682 [2024-07-12 18:38:24.210124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.682 [2024-07-12 18:38:24.210519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.682 [2024-07-12 18:38:24.210912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.682 [2024-07-12 18:38:24.211323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.682 [2024-07-12 18:38:24.211767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.682 [2024-07-12 18:38:24.212172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.682 [2024-07-12 18:38:24.212561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.682 [2024-07-12 18:38:24.212957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.682 [2024-07-12 18:38:24.213351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [last message repeated ~270 times, 18:38:24.213351 through 18:38:24.396274]
00:34:40.684 [2024-07-12 18:38:24.396321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.396589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.396605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.396619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.396634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.398319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.398363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.398404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.398444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.398706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.399808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.399857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.400260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.684 [2024-07-12 18:38:24.400303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.400739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.400756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.400771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.400785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.684 [2024-07-12 18:38:24.402825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.402877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.402922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.402969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.403237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.403295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.403336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.403376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.685 [2024-07-12 18:38:24.403416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.403707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.403724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.403741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.685 [2024-07-12 18:38:24.403756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.405304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.405350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.405390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.405430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.405801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.405879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.405922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.405971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.947 [2024-07-12 18:38:24.406011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.406443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.406460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.406476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.406493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.408594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.408640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.408679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.408720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.408990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.409047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.409087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.409127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.947 [2024-07-12 18:38:24.409167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.409429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.409446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.409460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.409474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.411119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.411170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.411213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.411255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.411522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.411580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.411622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.411663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.947 [2024-07-12 18:38:24.411705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.412123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.412140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.412155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.412170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.414403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.414447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.414498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.414540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.414812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.414865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.414908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.414965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.947 [2024-07-12 18:38:24.415005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.415268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.415285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.415300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.415314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.417010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.417054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.417094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.417134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.417400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.417460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.417501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.417541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.947 [2024-07-12 18:38:24.417581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.417994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.418012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.418027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.418042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.420470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.420518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.420559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.420599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.420940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.420998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.947 [2024-07-12 18:38:24.421039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.421079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.948 [2024-07-12 18:38:24.421119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.421381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.421397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.421412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.421426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.423089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.423149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.423191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.423231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.423498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.423563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.423604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.423645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.948 [2024-07-12 18:38:24.423686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.424015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.424033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.424048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.424063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.426626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.426673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.426717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.426758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.427108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.427171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.427212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.427252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.948 [2024-07-12 18:38:24.427292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.427555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.427571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.427586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.427600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.429210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.429255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.429296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.429336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.429605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.429666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.429707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.429757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.948 [2024-07-12 18:38:24.429800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.430072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.430088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.430104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.430120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.432476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.432522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.432563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.432605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.432872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.432931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.432972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.433020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.948 [2024-07-12 18:38:24.433061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.433332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.433352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.433367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.433381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.435051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.435098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.435138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.435179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.435445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.435502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.435543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.948 [2024-07-12 18:38:24.435583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.948 [2024-07-12 18:38:24.435623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.951 [2024-07-12 18:38:24.526082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.526560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.526580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.526595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.526610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.529211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.529606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.530002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.530405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.530808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.531220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.531609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.532001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.951 [2024-07-12 18:38:24.532394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.532835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.532851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.532866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.532880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.535613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.536014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.536404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.536795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.537215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.537612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.538008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.538400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.951 [2024-07-12 18:38:24.538789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.539233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.539252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.539267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.539282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.542003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.542395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.542784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.543189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.543567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.951 [2024-07-12 18:38:24.543979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.544369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.544756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.952 [2024-07-12 18:38:24.545145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.545561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.545577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.545592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.545606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.548272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.548670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.549069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.549459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.549903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.550308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.550697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.551087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.952 [2024-07-12 18:38:24.551489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.551938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.551955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.551969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.551985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.554711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.555111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.555499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.555884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.556344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.556758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.557156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.557547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.952 [2024-07-12 18:38:24.557937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.558397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.558415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.558430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.558444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.561099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.561492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.561884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.562281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.562730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.563136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.563523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.563909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.952 [2024-07-12 18:38:24.564301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.564658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.564676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.564690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.564706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.567619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.568020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.568425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.568812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.569291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.569691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.570087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.570478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.952 [2024-07-12 18:38:24.570868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.571296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.571313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.571328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.571343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.574106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.574502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.574890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.575285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.575672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.576083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.576471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.576858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.952 [2024-07-12 18:38:24.577256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.577608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.577625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.577640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.577655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.580286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.580678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.581088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.581478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.581918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.582326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.582717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.583116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.952 [2024-07-12 18:38:24.583510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.583998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.584017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.584032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.584047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.586743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.587142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.587527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.587912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.588307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.588718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.589115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.589501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.952 [2024-07-12 18:38:24.589886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.590319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.590337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.590354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.590369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.592945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.593337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.952 [2024-07-12 18:38:24.594358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.595271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.595584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.596698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.597089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.597473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.953 [2024-07-12 18:38:24.597861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.598314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.598332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.598346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.598361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.600963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.601372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.601777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.602175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.602628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.603033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.603423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.603812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.953 [2024-07-12 18:38:24.604216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.604610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.604627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.604641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.604656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.607405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.607797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.608190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.608574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.609028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.609432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.609824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:40.953 [2024-07-12 18:38:24.610227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:40.953 [2024-07-12 18:38:24.610617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously from 18:38:24.610617 through 18:38:24.790083 (console timestamps 00:34:40.953–00:34:41.218) ...]
00:34:41.218 [2024-07-12 18:38:24.790123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.790387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.790404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.790418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.790432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.792158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.792212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.792260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.792300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.792569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.792630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.792671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.792712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.218 [2024-07-12 18:38:24.792753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.793231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.793249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.793264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.793280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.795444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.795495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.795540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.795585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.795853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.795904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.795963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.796004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.218 [2024-07-12 18:38:24.796043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.796306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.796323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.796337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.796351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.798020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.798064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.798104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.798144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.798410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.798472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.798519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.798618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.218 [2024-07-12 18:38:24.798666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.799123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.799141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.799157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.799172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.801419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.801464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.801504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.801545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.801878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.801944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.801985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.802025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.218 [2024-07-12 18:38:24.802064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.802327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.802343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.802358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.802372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.218 [2024-07-12 18:38:24.804628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.804966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.807439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.807485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.807534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.807574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.807843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.807901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.807965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.808010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.218 [2024-07-12 18:38:24.808051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.808330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.218 [2024-07-12 18:38:24.808347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.808361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.808376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.219 [2024-07-12 18:38:24.810572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.810887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.813303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.813350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.813392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.813439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.813773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.813828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.813871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.813912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.219 [2024-07-12 18:38:24.813961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.814266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.814285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.814302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.814317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.815979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.219 [2024-07-12 18:38:24.816578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.816896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.819076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.819124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.819170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.819212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.819629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.819685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.819726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.819766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.219 [2024-07-12 18:38:24.819810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.820121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.820138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.820152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.820166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.821761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.821807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.821847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.821888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.822209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.822274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.822317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.822360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.219 [2024-07-12 18:38:24.822409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.822677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.822697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.822712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.822726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.824895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.824952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.824994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.825034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.825478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.825532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.825576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.825622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.219 [2024-07-12 18:38:24.825663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.825939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.825956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.825971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.825990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.827638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.827686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.827726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.827767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.828053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.828113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.828158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.219 [2024-07-12 18:38:24.828198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.219 [2024-07-12 18:38:24.828239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:34:41.222 [identical "Failed to get src_mbufs!" error repeated continuously from 18:38:24.828239 through 18:38:24.926392]
00:34:41.222 [2024-07-12 18:38:24.926783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.222 [2024-07-12 18:38:24.927257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.222 [2024-07-12 18:38:24.927275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.222 [2024-07-12 18:38:24.927290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.222 [2024-07-12 18:38:24.927306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.929995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.930397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.930784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.931187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.931628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.932037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.932424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.932813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.223 [2024-07-12 18:38:24.933207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.933534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.933552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.933569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.933585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.936184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.936584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.936981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.223 [2024-07-12 18:38:24.937372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.937807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.938213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.938615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.939016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.481 [2024-07-12 18:38:24.939407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.939840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.939857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.939875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.939890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.942508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.942906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.943302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.943692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.944087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.944489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.944884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.945281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.481 [2024-07-12 18:38:24.945685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.946129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.946148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.946164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.946179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.948893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.949303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.949696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.950098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.481 [2024-07-12 18:38:24.950608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.951019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.951426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.951817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.482 [2024-07-12 18:38:24.952226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.952499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.952519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.952533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.952547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.955429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.955844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.956246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.956636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.957013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.957419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.957810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.958219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:41.482 [2024-07-12 18:38:24.958612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.959060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.959078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.959094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.959110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.963340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:34:41.482 [2024-07-12 18:38:24.963734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:34:42.047
00:34:42.047 Latency(us)
00:34:42.047 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:42.047 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:42.047 Verification LBA range: start 0x0 length 0x100
00:34:42.047 crypto_ram : 6.14 41.72 2.61 0.00 0.00 2975815.23 284483.23 2698943.44
00:34:42.047 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:42.047 Verification LBA range: start 0x100 length 0x100
00:34:42.047 crypto_ram : 6.08 42.08 2.63 0.00 0.00 2944578.78 286306.84 2538465.73
00:34:42.047 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:42.047 Verification LBA range: start 0x0 length 0x100
00:34:42.047 crypto_ram1 : 6.14 41.71 2.61 0.00 0.00 2864970.35 282659.62 2480110.19
00:34:42.047 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:42.047 Verification LBA range: start 0x100 length 0x100
00:34:42.047 crypto_ram1 : 6.08 42.07 2.63 0.00 0.00 2835133.66 286306.84 2319632.47
00:34:42.047 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:42.047 Verification LBA range: start 0x0 length 0x100
00:34:42.047 crypto_ram2 : 5.66 258.35 16.15 0.00 0.00 438184.35 33508.84 682030.30
00:34:42.047 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:42.047 Verification LBA range: start 0x100 length 0x100
00:34:42.047 crypto_ram2 : 5.64 272.00 17.00 0.00 0.00 417052.58 83886.08 663794.20
00:34:42.047 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:42.047 Verification LBA range: start 0x0 length 0x100
00:34:42.047 crypto_ram3 : 5.75 267.15 16.70 0.00 0.00 412075.41 41031.23 351956.81
00:34:42.047 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:42.047 Verification LBA range: start 0x100 length 0x100
00:34:42.047 crypto_ram3 : 5.80 286.65 17.92 0.00 0.00 385314.74 65194.07 474138.71
00:34:42.047 ===================================================================================================================
00:34:42.047 Total : 1251.73 78.23 0.00 0.00 765886.80 33508.84 2698943.44
00:34:42.613
00:34:42.613 real 0m9.320s
00:34:42.613 user 0m17.669s
00:34:42.613 sys 0m0.486s
00:34:42.613 18:38:26 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:42.613 18:38:26 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:34:42.613 ************************************
00:34:42.613 END TEST bdev_verify_big_io
00:34:42.613 ************************************
00:34:42.613 18:38:26 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:34:42.613 18:38:26 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:42.613 18:38:26 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:42.613 18:38:26 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:42.613 18:38:26 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:42.613 ************************************
00:34:42.613 START TEST bdev_write_zeroes
00:34:42.613 ************************************
00:34:42.613 18:38:26 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:42.613 [2024-07-12 18:38:26.162596] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:34:42.613 [2024-07-12 18:38:26.162657] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2667133 ]
00:34:42.613 [2024-07-12 18:38:26.290735] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:42.872 [2024-07-12 18:38:26.387825] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:42.872 [2024-07-12 18:38:26.409111] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:34:42.872 [2024-07-12 18:38:26.417139] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:34:42.872 [2024-07-12 18:38:26.425157] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:34:42.872 [2024-07-12 18:38:26.530668] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:34:45.438 [2024-07-12 18:38:28.762557] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:34:45.439 [2024-07-12 18:38:28.762620] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:45.439 [2024-07-12 18:38:28.762634] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:45.439 [2024-07-12 18:38:28.770576] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:34:45.439 [2024-07-12 18:38:28.770596] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:45.439 [2024-07-12 18:38:28.770608] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:45.439 [2024-07-12 18:38:28.778596] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:34:45.439 [2024-07-12 18:38:28.778615] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:45.439 [2024-07-12 18:38:28.778627] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:45.439 [2024-07-12 18:38:28.786617] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:34:45.439 [2024-07-12 18:38:28.786635] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:45.439 [2024-07-12 18:38:28.786647] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:45.439 Running I/O for 1 seconds...
00:34:46.375
00:34:46.375 Latency(us)
00:34:46.375 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:46.375 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:46.375 crypto_ram : 1.02 2029.76 7.93 0.00 0.00 62537.78 5641.79 75679.83
00:34:46.375 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:46.375 crypto_ram1 : 1.03 2042.86 7.98 0.00 0.00 61851.49 5613.30 70209.00
00:34:46.375 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:46.375 crypto_ram2 : 1.02 15686.64 61.28 0.00 0.00 8039.91 2436.23 10599.74
00:34:46.375 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:46.375 crypto_ram3 : 1.02 15665.30 61.19 0.00 0.00 8015.12 2436.23 8377.21
00:34:46.375 ===================================================================================================================
00:34:46.375 Total : 35424.57 138.38 0.00 0.00 14280.18 2436.23 75679.83
00:34:46.633
00:34:46.633 real 0m4.227s
00:34:46.633 user 0m3.783s
00:34:46.633 sys 0m0.398s
00:34:46.633 18:38:30 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:46.633 18:38:30 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:34:46.633 ************************************
00:34:46.633 END TEST bdev_write_zeroes
00:34:46.633 ************************************
00:34:46.891 18:38:30 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:34:46.891 18:38:30 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:46.891 18:38:30 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:46.891 18:38:30 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:46.891 18:38:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:46.891 ************************************
00:34:46.891 START TEST bdev_json_nonenclosed
00:34:46.891 ************************************
00:34:46.891 18:38:30 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:46.891 [2024-07-12 18:38:30.473053] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:34:46.891 [2024-07-12 18:38:30.473113] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2667721 ]
00:34:46.891 [2024-07-12 18:38:30.601327] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:47.150 [2024-07-12 18:38:30.698890] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:47.150 [2024-07-12 18:38:30.698962] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:34:47.150 [2024-07-12 18:38:30.698983] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:34:47.150 [2024-07-12 18:38:30.698995] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:34:47.150
00:34:47.150 real 0m0.385s
00:34:47.150 user 0m0.239s
00:34:47.150 sys 0m0.144s
00:34:47.150 18:38:30 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:34:47.150 18:38:30 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:47.150 18:38:30 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:34:47.150 ************************************
00:34:47.150 END TEST bdev_json_nonenclosed
00:34:47.150 ************************************
00:34:47.150 18:38:30 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:34:47.150 18:38:30 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true
00:34:47.150 18:38:30 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:47.150 18:38:30 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:47.150 18:38:30 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:47.150 18:38:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:47.150 ************************************
00:34:47.150 START TEST bdev_json_nonarray
00:34:47.150 ************************************
00:34:47.150 18:38:30 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:47.409 [2024-07-12 18:38:30.922792] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization...
00:34:47.409 [2024-07-12 18:38:30.922851] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2667848 ]
00:34:47.409 [2024-07-12 18:38:31.039667] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:47.665 [2024-07-12 18:38:31.136572] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:47.665 [2024-07-12 18:38:31.136639] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:34:47.665 [2024-07-12 18:38:31.136661] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:34:47.665 [2024-07-12 18:38:31.136673] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:34:47.665
00:34:47.665 real 0m0.372s
00:34:47.665 user 0m0.234s
00:34:47.665 sys 0m0.136s
00:34:47.665 18:38:31 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:34:47.665 18:38:31 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:47.665 18:38:31 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:34:47.665 ************************************
00:34:47.665 END TEST bdev_json_nonarray
00:34:47.665 ************************************
00:34:47.665 18:38:31 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:34:47.665 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]]
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]]
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]]
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]]
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]]
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]]
00:34:47.666 18:38:31 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]]
00:34:47.666
00:34:47.666 real 1m12.437s
00:34:47.666 user 2m40.692s
00:34:47.666 sys 0m9.018s
00:34:47.666 18:38:31 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:47.666 18:38:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:34:47.666 ************************************
00:34:47.666 END TEST blockdev_crypto_qat
00:34:47.666 ************************************
00:34:47.666 18:38:31 -- common/autotest_common.sh@1142 -- # return 0
00:34:47.666 18:38:31 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:34:47.666 18:38:31 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:34:47.666 18:38:31 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:47.666 18:38:31 -- common/autotest_common.sh@10 -- # set +x
00:34:47.666 ************************************
00:34:47.666 START TEST chaining
00:34:47.666 ************************************
00:34:47.666 18:38:31 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:34:47.922 * Looking for test storage...
00:34:47.922 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:34:47.922 18:38:31 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@7 -- # uname -s
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:34:47.922 18:38:31 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]]
00:34:47.922 18:38:31 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:34:47.922 18:38:31 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:34:47.922 18:38:31 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:34:47.922 18:38:31 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:34:47.922 18:38:31 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:34:47.922 18:38:31 chaining -- paths/export.sh@5 -- # export PATH
00:34:47.922 18:38:31 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@47 -- # : 0
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']'
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']'
00:34:47.922 18:38:31 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0
00:34:47.922 18:38:31 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0
00:34:47.922 18:38:31 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500)
00:34:47.922 18:38:31 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122)
00:34:47.922 18:38:31 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock
00:34:47.922 18:38:31 chaining -- bdev/chaining.sh@20 -- # declare -A stats
00:34:47.922 18:38:31 chaining -- bdev/chaining.sh@66 -- # nvmftestinit
00:34:47.923 18:38:31 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']'
00:34:47.923 18:38:31 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT
00:34:47.923 18:38:31 chaining -- nvmf/common.sh@448 -- # prepare_net_devs
00:34:47.923 18:38:31 chaining -- nvmf/common.sh@410 -- # local
-g is_hw=no 00:34:47.923 18:38:31 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:34:47.923 18:38:31 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:47.923 18:38:31 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:47.923 18:38:31 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:47.923 18:38:31 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:34:47.923 18:38:31 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:34:47.923 18:38:31 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:34:47.923 18:38:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@296 -- # e810=() 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@297 -- # x722=() 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@298 -- # mlx=() 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@336 -- # return 1 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:34:54.488 WARNING: No supported devices were found, fallback requested for tcp test 00:34:54.488 18:38:38 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:34:54.488 Cannot find device "nvmf_tgt_br" 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@155 -- # true 00:34:54.488 18:38:38 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:34:54.749 Cannot find device "nvmf_tgt_br2" 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@156 -- # true 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:34:54.749 Cannot find device "nvmf_tgt_br" 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@158 -- # 
true 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:34:54.749 Cannot find device "nvmf_tgt_br2" 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@159 -- # true 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:34:54.749 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@162 -- # true 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:34:54.749 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@163 -- # true 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:34:54.749 18:38:38 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:34:55.008 18:38:38 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:34:55.008 18:38:38 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:34:55.008 18:38:38 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:34:55.008 18:38:38 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:34:55.267 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:34:55.267 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.107 ms 00:34:55.267 00:34:55.267 --- 10.0.0.2 ping statistics --- 00:34:55.267 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:55.267 rtt min/avg/max/mdev = 0.107/0.107/0.107/0.000 ms 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:34:55.267 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:34:55.267 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.074 ms 00:34:55.267 00:34:55.267 --- 10.0.0.3 ping statistics --- 00:34:55.267 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:55.267 rtt min/avg/max/mdev = 0.074/0.074/0.074/0.000 ms 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:34:55.267 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:34:55.267 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.038 ms 00:34:55.267 00:34:55.267 --- 10.0.0.1 ping statistics --- 00:34:55.267 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:55.267 rtt min/avg/max/mdev = 0.038/0.038/0.038/0.000 ms 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@433 -- # return 0 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:34:55.267 18:38:38 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:34:55.267 18:38:38 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:34:55.267 18:38:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@481 -- # nvmfpid=2671443 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 
-e 0xFFFF -m 0x2 00:34:55.267 18:38:38 chaining -- nvmf/common.sh@482 -- # waitforlisten 2671443 00:34:55.267 18:38:38 chaining -- common/autotest_common.sh@829 -- # '[' -z 2671443 ']' 00:34:55.267 18:38:38 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:55.267 18:38:38 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:55.267 18:38:38 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:55.267 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:55.267 18:38:38 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:55.267 18:38:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:55.267 [2024-07-12 18:38:38.940982] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:34:55.267 [2024-07-12 18:38:38.941120] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:55.526 [2024-07-12 18:38:39.141346] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:55.526 [2024-07-12 18:38:39.242432] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:34:55.526 [2024-07-12 18:38:39.242478] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:34:55.526 [2024-07-12 18:38:39.242493] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:34:55.526 [2024-07-12 18:38:39.242506] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:34:55.526 [2024-07-12 18:38:39.242517] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
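The fallback network that `nvmf_veth_init` brought up earlier in this run is a veth pair per interface, with the target-side ends moved into the `nvmf_tgt_ns_spdk` namespace and the host-side peers enslaved to a bridge. A dry-run sketch of that topology (the `run` helper is mine and only echoes, so this is safe to execute without root; swap it for `eval "$@"` to apply for real):

```shell
#!/bin/sh
# Dry-run sketch of the veth/namespace topology from nvmf_veth_init.
# Interface names and addresses mirror this log; run() only prints.
run() { echo "$@"; }

NS=nvmf_tgt_ns_spdk
run ip netns add "$NS"
run ip link add nvmf_init_if type veth peer name nvmf_init_br
run ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br
run ip link set nvmf_tgt_if netns "$NS"
run ip addr add 10.0.0.1/24 dev nvmf_init_if
run ip netns exec "$NS" ip addr add 10.0.0.2/24 dev nvmf_tgt_if
run ip link add nvmf_br type bridge
run ip link set nvmf_init_br master nvmf_br
run ip link set nvmf_tgt_br master nvmf_br
# Admit NVMe/TCP traffic to the initiator-side interface (port 4420).
run iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT
```

After the real equivalent of these steps, the log's three pings (10.0.0.2, 10.0.0.3, and 10.0.0.1 from inside the namespace) verify host-to-namespace reachability across the bridge.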
00:34:55.526 [2024-07-12 18:38:39.242545] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:56.460 18:38:40 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:56.460 18:38:40 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:56.460 18:38:40 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:34:56.460 18:38:40 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:56.460 18:38:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:56.460 18:38:40 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:34:56.460 18:38:40 chaining -- bdev/chaining.sh@69 -- # mktemp 00:34:56.460 18:38:40 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.CGh8VKPD3T 00:34:56.460 18:38:40 chaining -- bdev/chaining.sh@69 -- # mktemp 00:34:56.460 18:38:40 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.MlB10xJUFT 00:34:56.460 18:38:40 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:34:56.460 18:38:40 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:34:56.460 18:38:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:56.460 18:38:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:56.460 malloc0 00:34:56.460 true 00:34:56.460 true 00:34:56.460 [2024-07-12 18:38:40.176877] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:56.460 crypto0 00:34:56.460 [2024-07-12 18:38:40.184907] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:56.719 crypto1 00:34:56.719 [2024-07-12 18:38:40.193036] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:34:56.719 [2024-07-12 18:38:40.209248] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:34:56.719 18:38:40 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:56.719 18:38:40 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.CGh8VKPD3T bs=1K count=64 00:34:56.719 64+0 records in 00:34:56.719 64+0 records out 00:34:56.719 65536 bytes (66 kB, 64 KiB) copied, 0.00106979 s, 61.3 MB/s 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.CGh8VKPD3T --ob Nvme0n1 --bs 65536 --count 1 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@25 -- # local config 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:56.719 18:38:40 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:56.719 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:56.978 18:38:40 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:56.978 "subsystems": [ 00:34:56.978 { 00:34:56.978 "subsystem": "bdev", 00:34:56.978 "config": [ 00:34:56.978 { 00:34:56.978 "method": "bdev_nvme_attach_controller", 00:34:56.978 "params": { 00:34:56.978 "trtype": "tcp", 00:34:56.978 "adrfam": "IPv4", 00:34:56.978 "name": "Nvme0", 00:34:56.978 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:56.978 "traddr": "10.0.0.2", 00:34:56.978 "trsvcid": "4420" 00:34:56.978 } 00:34:56.978 }, 00:34:56.978 { 00:34:56.978 "method": "bdev_set_options", 00:34:56.978 "params": { 00:34:56.978 "bdev_auto_examine": false 00:34:56.978 } 00:34:56.978 } 00:34:56.978 ] 00:34:56.978 } 00:34:56.978 ] 00:34:56.978 }' 00:34:56.978 18:38:40 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.CGh8VKPD3T --ob Nvme0n1 --bs 65536 --count 1 00:34:56.978 18:38:40 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:56.978 "subsystems": [ 00:34:56.978 { 00:34:56.978 "subsystem": "bdev", 00:34:56.978 "config": [ 00:34:56.978 { 00:34:56.978 "method": "bdev_nvme_attach_controller", 00:34:56.978 "params": { 
00:34:56.978 "trtype": "tcp", 00:34:56.978 "adrfam": "IPv4", 00:34:56.978 "name": "Nvme0", 00:34:56.978 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:56.978 "traddr": "10.0.0.2", 00:34:56.978 "trsvcid": "4420" 00:34:56.978 } 00:34:56.978 }, 00:34:56.978 { 00:34:56.978 "method": "bdev_set_options", 00:34:56.978 "params": { 00:34:56.978 "bdev_auto_examine": false 00:34:56.978 } 00:34:56.978 } 00:34:56.978 ] 00:34:56.978 } 00:34:56.978 ] 00:34:56.978 }' 00:34:56.978 [2024-07-12 18:38:40.515645] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:34:56.978 [2024-07-12 18:38:40.515706] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2671753 ] 00:34:56.978 [2024-07-12 18:38:40.645333] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:57.237 [2024-07-12 18:38:40.743158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:57.495  Copying: 64/64 [kB] (average 20 MBps) 00:34:57.495 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:57.495 18:38:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:57.495 18:38:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:57.495 18:38:41 chaining -- common/autotest_common.sh@587 -- 
# [[ 0 == 0 ]] 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:57.495 18:38:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:57.495 18:38:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:57.495 18:38:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:57.495 18:38:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:57.753 18:38:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:57.753 18:38:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:57.753 18:38:41 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:57.753 18:38:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:57.753 18:38:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:57.753 18:38:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@96 -- # update_stats 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:57.753 18:38:41 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:57.754 18:38:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:57.754 18:38:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:57.754 
18:38:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:57.754 18:38:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:57.754 18:38:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:57.754 18:38:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:57.754 18:38:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:57.754 18:38:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:57.754 18:38:41 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:57.754 18:38:41 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:58.012 18:38:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:58.012 18:38:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:58.012 18:38:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.MlB10xJUFT --ib Nvme0n1 --bs 65536 --count 1 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@25 -- # local config 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:58.012 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:58.012 "subsystems": [ 00:34:58.012 { 00:34:58.012 "subsystem": "bdev", 00:34:58.012 "config": [ 00:34:58.012 { 00:34:58.012 "method": "bdev_nvme_attach_controller", 00:34:58.012 
"params": { 00:34:58.012 "trtype": "tcp", 00:34:58.012 "adrfam": "IPv4", 00:34:58.012 "name": "Nvme0", 00:34:58.012 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:58.012 "traddr": "10.0.0.2", 00:34:58.012 "trsvcid": "4420" 00:34:58.012 } 00:34:58.012 }, 00:34:58.012 { 00:34:58.012 "method": "bdev_set_options", 00:34:58.012 "params": { 00:34:58.012 "bdev_auto_examine": false 00:34:58.012 } 00:34:58.012 } 00:34:58.012 ] 00:34:58.012 } 00:34:58.012 ] 00:34:58.012 }' 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.MlB10xJUFT --ib Nvme0n1 --bs 65536 --count 1 00:34:58.012 18:38:41 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:58.012 "subsystems": [ 00:34:58.012 { 00:34:58.012 "subsystem": "bdev", 00:34:58.012 "config": [ 00:34:58.012 { 00:34:58.012 "method": "bdev_nvme_attach_controller", 00:34:58.012 "params": { 00:34:58.012 "trtype": "tcp", 00:34:58.012 "adrfam": "IPv4", 00:34:58.012 "name": "Nvme0", 00:34:58.012 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:58.012 "traddr": "10.0.0.2", 00:34:58.012 "trsvcid": "4420" 00:34:58.012 } 00:34:58.012 }, 00:34:58.012 { 00:34:58.012 "method": "bdev_set_options", 00:34:58.012 "params": { 00:34:58.012 "bdev_auto_examine": false 00:34:58.012 } 00:34:58.012 } 00:34:58.012 ] 00:34:58.012 } 00:34:58.012 ] 00:34:58.012 }' 00:34:58.012 [2024-07-12 18:38:41.602732] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:34:58.012 [2024-07-12 18:38:41.602783] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2671868 ] 00:34:58.012 [2024-07-12 18:38:41.715974] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:58.269 [2024-07-12 18:38:41.816605] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:58.544  Copying: 64/64 [kB] (average 31 MBps) 00:34:58.544 00:34:58.544 18:38:42 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:34:58.544 18:38:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:58.544 18:38:42 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:58.544 18:38:42 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:58.544 18:38:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:58.544 18:38:42 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:58.544 18:38:42 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:58.544 18:38:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:58.544 18:38:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:58.544 18:38:42 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:58.544 18:38:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:58.809 18:38:42 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:34:58.809 18:38:42 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:58.810 18:38:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:58.810 18:38:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:58.810 18:38:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:58.810 18:38:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:58.810 18:38:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:58.810 18:38:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:58.810 18:38:42 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:58.810 18:38:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:58.810 18:38:42 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.CGh8VKPD3T /tmp/tmp.MlB10xJUFT 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@25 -- # local config 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:58.810 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:58.810 "subsystems": [ 00:34:58.810 { 00:34:58.810 "subsystem": "bdev", 00:34:58.810 "config": [ 00:34:58.810 { 00:34:58.810 "method": "bdev_nvme_attach_controller", 00:34:58.810 "params": { 00:34:58.810 "trtype": "tcp", 00:34:58.810 "adrfam": "IPv4", 00:34:58.810 "name": "Nvme0", 00:34:58.810 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:58.810 "traddr": "10.0.0.2", 00:34:58.810 "trsvcid": "4420" 00:34:58.810 } 00:34:58.810 }, 00:34:58.810 { 00:34:58.810 "method": "bdev_set_options", 00:34:58.810 "params": { 00:34:58.810 "bdev_auto_examine": false 00:34:58.810 } 00:34:58.810 } 00:34:58.810 ] 00:34:58.810 } 00:34:58.810 ] 00:34:58.810 }' 00:34:58.810 
18:38:42 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:58.810 18:38:42 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:58.810 "subsystems": [ 00:34:58.810 { 00:34:58.810 "subsystem": "bdev", 00:34:58.810 "config": [ 00:34:58.810 { 00:34:58.810 "method": "bdev_nvme_attach_controller", 00:34:58.810 "params": { 00:34:58.810 "trtype": "tcp", 00:34:58.810 "adrfam": "IPv4", 00:34:58.810 "name": "Nvme0", 00:34:58.810 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:58.810 "traddr": "10.0.0.2", 00:34:58.810 "trsvcid": "4420" 00:34:58.810 } 00:34:58.810 }, 00:34:58.810 { 00:34:58.810 "method": "bdev_set_options", 00:34:58.810 "params": { 00:34:58.810 "bdev_auto_examine": false 00:34:58.810 } 00:34:58.810 } 00:34:58.810 ] 00:34:58.810 } 00:34:58.810 ] 00:34:58.810 }' 00:34:59.069 [2024-07-12 18:38:42.574796] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:34:59.069 [2024-07-12 18:38:42.574865] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672000 ] 00:34:59.069 [2024-07-12 18:38:42.704250] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:59.341 [2024-07-12 18:38:42.801322] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:59.600  Copying: 64/64 [kB] (average 10 MBps) 00:34:59.600 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@106 -- # update_stats 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:59.600 18:38:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:59.600 18:38:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:59.600 18:38:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:59.600 18:38:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:59.600 18:38:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:59.600 18:38:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:59.600 18:38:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:59.600 18:38:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:59.600 18:38:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:59.600 18:38:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:59.858 
18:38:43 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:59.858 18:38:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:59.858 18:38:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:59.858 18:38:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.CGh8VKPD3T --ob Nvme0n1 --bs 4096 --count 16 00:34:59.858 18:38:43 chaining -- bdev/chaining.sh@25 -- # local config 00:34:59.859 18:38:43 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:59.859 18:38:43 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:59.859 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:59.859 18:38:43 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:59.859 "subsystems": [ 00:34:59.859 { 00:34:59.859 "subsystem": "bdev", 00:34:59.859 "config": [ 00:34:59.859 { 00:34:59.859 "method": "bdev_nvme_attach_controller", 00:34:59.859 "params": { 00:34:59.859 "trtype": "tcp", 00:34:59.859 "adrfam": "IPv4", 00:34:59.859 "name": "Nvme0", 00:34:59.859 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:59.859 "traddr": "10.0.0.2", 00:34:59.859 "trsvcid": "4420" 00:34:59.859 } 00:34:59.859 }, 00:34:59.859 { 00:34:59.859 "method": "bdev_set_options", 00:34:59.859 "params": { 00:34:59.859 "bdev_auto_examine": false 00:34:59.859 } 00:34:59.859 } 00:34:59.859 ] 00:34:59.859 } 00:34:59.859 ] 00:34:59.859 }' 00:34:59.859 18:38:43 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.CGh8VKPD3T --ob Nvme0n1 --bs 4096 --count 16 00:34:59.859 18:38:43 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:59.859 "subsystems": [ 00:34:59.859 { 00:34:59.859 "subsystem": "bdev", 00:34:59.859 "config": [ 00:34:59.859 { 00:34:59.859 "method": "bdev_nvme_attach_controller", 00:34:59.859 "params": { 00:34:59.859 "trtype": "tcp", 00:34:59.859 "adrfam": "IPv4", 00:34:59.859 "name": "Nvme0", 00:34:59.859 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:59.859 "traddr": "10.0.0.2", 00:34:59.859 "trsvcid": "4420" 00:34:59.859 } 00:34:59.859 }, 00:34:59.859 { 00:34:59.859 "method": "bdev_set_options", 00:34:59.859 "params": { 00:34:59.859 "bdev_auto_examine": false 00:34:59.859 } 00:34:59.859 } 00:34:59.859 ] 00:34:59.859 } 00:34:59.859 ] 00:34:59.859 }' 00:34:59.859 [2024-07-12 18:38:43.506661] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:34:59.859 [2024-07-12 18:38:43.506725] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672192 ] 00:35:00.117 [2024-07-12 18:38:43.633435] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:00.117 [2024-07-12 18:38:43.731306] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:00.635  Copying: 64/64 [kB] (average 9142 kBps) 00:35:00.635 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@40 -- 
# [[ -z '' ]] 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:00.635 18:38:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@114 -- # update_stats 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:35:00.635 18:38:44 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:00.894 18:38:44 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:00.894 18:38:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@117 -- # : 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.MlB10xJUFT --ib Nvme0n1 --bs 4096 --count 16 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@25 -- # local config 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:00.894 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:00.894 "subsystems": [ 00:35:00.894 { 00:35:00.894 "subsystem": "bdev", 00:35:00.894 "config": [ 00:35:00.894 { 00:35:00.894 "method": "bdev_nvme_attach_controller", 00:35:00.894 "params": { 00:35:00.894 "trtype": "tcp", 00:35:00.894 "adrfam": "IPv4", 00:35:00.894 "name": "Nvme0", 00:35:00.894 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:00.894 "traddr": "10.0.0.2", 00:35:00.894 "trsvcid": "4420" 00:35:00.894 } 00:35:00.894 }, 00:35:00.894 { 00:35:00.894 "method": "bdev_set_options", 00:35:00.894 "params": { 00:35:00.894 "bdev_auto_examine": false 00:35:00.894 } 00:35:00.894 } 00:35:00.894 ] 00:35:00.894 } 00:35:00.894 ] 00:35:00.894 }' 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:00.894 "subsystems": [ 00:35:00.894 { 00:35:00.894 "subsystem": "bdev", 00:35:00.894 "config": [ 00:35:00.894 { 00:35:00.894 "method": "bdev_nvme_attach_controller", 00:35:00.894 "params": { 00:35:00.894 "trtype": "tcp", 00:35:00.894 "adrfam": "IPv4", 00:35:00.894 "name": "Nvme0", 00:35:00.894 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:00.894 "traddr": "10.0.0.2", 00:35:00.894 "trsvcid": "4420" 00:35:00.894 } 00:35:00.894 }, 00:35:00.894 { 00:35:00.894 "method": "bdev_set_options", 00:35:00.894 "params": { 00:35:00.894 "bdev_auto_examine": false 00:35:00.894 } 00:35:00.894 } 00:35:00.894 ] 00:35:00.894 } 00:35:00.894 ] 00:35:00.894 }' 00:35:00.894 18:38:44 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.MlB10xJUFT --ib Nvme0n1 --bs 4096 --count 16 00:35:01.153 [2024-07-12 18:38:44.661295] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 
initialization... 00:35:01.153 [2024-07-12 18:38:44.661362] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672399 ] 00:35:01.153 [2024-07-12 18:38:44.792330] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:01.411 [2024-07-12 18:38:44.899894] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:01.670  Copying: 64/64 [kB] (average 1454 kBps) 00:35:01.670 00:35:01.670 18:38:45 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:35:01.670 18:38:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:01.670 18:38:45 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:01.670 18:38:45 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:01.670 18:38:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:01.670 18:38:45 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:01.670 18:38:45 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:01.670 18:38:45 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:01.670 18:38:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:01.670 18:38:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:01.929 18:38:45 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:01.929 18:38:45 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:01.929 18:38:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.CGh8VKPD3T /tmp/tmp.MlB10xJUFT 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.CGh8VKPD3T /tmp/tmp.MlB10xJUFT 00:35:01.929 18:38:45 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:35:01.929 18:38:45 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:01.929 18:38:45 chaining -- nvmf/common.sh@117 -- # sync 00:35:01.929 18:38:45 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:01.929 18:38:45 chaining -- nvmf/common.sh@120 -- # set +e 00:35:01.929 18:38:45 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:01.929 18:38:45 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:01.929 rmmod nvme_tcp 00:35:01.929 rmmod nvme_fabrics 00:35:02.187 rmmod nvme_keyring 00:35:02.187 18:38:45 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:02.187 18:38:45 chaining -- nvmf/common.sh@124 -- # set -e 00:35:02.187 18:38:45 chaining -- nvmf/common.sh@125 -- # return 0 00:35:02.187 18:38:45 chaining -- nvmf/common.sh@489 -- # '[' -n 2671443 ']' 00:35:02.187 18:38:45 chaining -- nvmf/common.sh@490 -- # killprocess 2671443 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@948 -- # '[' -z 
2671443 ']' 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@952 -- # kill -0 2671443 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@953 -- # uname 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2671443 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2671443' 00:35:02.187 killing process with pid 2671443 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@967 -- # kill 2671443 00:35:02.187 18:38:45 chaining -- common/autotest_common.sh@972 -- # wait 2671443 00:35:02.444 18:38:46 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:02.444 18:38:46 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:02.444 18:38:46 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:02.444 18:38:46 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:02.444 18:38:46 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:02.444 18:38:46 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:02.444 18:38:46 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:02.444 18:38:46 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:02.444 18:38:46 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:02.444 18:38:46 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:02.444 18:38:46 chaining -- bdev/chaining.sh@132 -- # bperfpid=2672612 00:35:02.444 18:38:46 chaining -- bdev/chaining.sh@134 -- # waitforlisten 2672612 00:35:02.444 18:38:46 chaining -- common/autotest_common.sh@829 -- # '[' -z 
2672612 ']' 00:35:02.444 18:38:46 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:02.444 18:38:46 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:02.444 18:38:46 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:02.444 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:02.444 18:38:46 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:02.444 18:38:46 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:02.444 18:38:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:02.701 [2024-07-12 18:38:46.198822] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:35:02.702 [2024-07-12 18:38:46.198883] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672612 ] 00:35:02.702 [2024-07-12 18:38:46.328983] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:02.958 [2024-07-12 18:38:46.442473] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:03.889 18:38:47 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:03.889 18:38:47 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:03.889 18:38:47 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:35:03.889 18:38:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:03.889 18:38:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:03.889 malloc0 00:35:03.889 true 00:35:03.889 true 00:35:03.889 [2024-07-12 18:38:47.586514] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 
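The stat checks earlier in this run all go through the `get_stat` helper (chaining.sh@37-44), which calls the `accel_get_stats` RPC and filters the JSON with jq: no opcode means the top-level `sequence_executed` counter, an opcode means a `select()` over the per-operation array. A minimal standalone sketch of that extraction — the sample JSON below is illustrative only, mimicking the shape the real RPC returns, with the counter values taken from this run's assertions:

```shell
# Standalone sketch of the get_stat jq extraction (sample JSON is
# illustrative; values mirror the assertions logged above).
command -v jq >/dev/null || { echo "jq not installed, skipping sketch"; exit 0; }

stats='{"sequence_executed": 47,
        "operations": [
          {"opcode": "encrypt", "executed": 36},
          {"opcode": "decrypt", "executed": 46},
          {"opcode": "copy",    "executed": 4}]}'

# No opcode given: read the top-level counter (as in chaining.sh@41).
echo "$stats" | jq -r .sequence_executed

# Opcode given: filter the per-operation array (as in chaining.sh@44).
echo "$stats" | jq -r '.operations[] | select(.opcode == "encrypt").executed'
```

The test script then compares these numbers against expected deltas, e.g. `(( 47 == stats[sequence_executed] + 16 ))` above.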
00:35:03.889 crypto0 00:35:03.889 [2024-07-12 18:38:47.594526] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:03.889 crypto1 00:35:03.889 18:38:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:03.889 18:38:47 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:04.146 Running I/O for 5 seconds... 00:35:09.407 00:35:09.407 Latency(us) 00:35:09.407 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:09.407 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:09.407 Verification LBA range: start 0x0 length 0x2000 00:35:09.407 crypto1 : 5.01 11496.07 44.91 0.00 0.00 22207.60 6439.62 14246.96 00:35:09.407 =================================================================================================================== 00:35:09.407 Total : 11496.07 44.91 0.00 0.00 22207.60 6439.62 14246.96 00:35:09.407 0 00:35:09.407 18:38:52 chaining -- bdev/chaining.sh@146 -- # killprocess 2672612 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@948 -- # '[' -z 2672612 ']' 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@952 -- # kill -0 2672612 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@953 -- # uname 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2672612 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2672612' 00:35:09.407 killing process with pid 2672612 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@967 -- # kill 2672612 00:35:09.407 Received shutdown signal, test time 
was about 5.000000 seconds 00:35:09.407 00:35:09.407 Latency(us) 00:35:09.407 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:09.407 =================================================================================================================== 00:35:09.407 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:09.407 18:38:52 chaining -- common/autotest_common.sh@972 -- # wait 2672612 00:35:09.407 18:38:53 chaining -- bdev/chaining.sh@152 -- # bperfpid=2673490 00:35:09.407 18:38:53 chaining -- bdev/chaining.sh@154 -- # waitforlisten 2673490 00:35:09.407 18:38:53 chaining -- common/autotest_common.sh@829 -- # '[' -z 2673490 ']' 00:35:09.407 18:38:53 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:09.407 18:38:53 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:09.407 18:38:53 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:09.407 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:09.407 18:38:53 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:09.407 18:38:53 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:09.407 18:38:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:09.664 [2024-07-12 18:38:53.229369] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
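The throughput columns in the bdevperf tables are internally consistent: the benchmark was started with `-o 4096`, so MiB/s is just IOPS scaled by the 4 KiB IO size. A quick check against the first table's crypto1 row (11496.07 IOPS, reported as 44.91 MiB/s):

```shell
# Reproduce the MiB/s column of the first Latency table from its IOPS
# column: 11496.07 IOPS * 4096 B per IO / 2^20 B per MiB.
awk 'BEGIN { printf "%.2f\n", 11496.07 * 4096 / 1048576 }'
```

This prints 44.91, matching the table.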
00:35:09.664 [2024-07-12 18:38:53.229504] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2673490 ] 00:35:09.922 [2024-07-12 18:38:53.424456] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:09.922 [2024-07-12 18:38:53.526574] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:10.901 18:38:54 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:10.901 18:38:54 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:10.901 18:38:54 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:35:10.901 18:38:54 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:10.901 18:38:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:10.901 malloc0 00:35:10.901 true 00:35:10.901 true 00:35:10.901 [2024-07-12 18:38:54.446249] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:35:10.901 [2024-07-12 18:38:54.446295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:10.901 [2024-07-12 18:38:54.446316] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2270730 00:35:10.901 [2024-07-12 18:38:54.446329] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:10.901 [2024-07-12 18:38:54.447403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:10.901 [2024-07-12 18:38:54.447443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:35:10.901 pt0 00:35:10.901 [2024-07-12 18:38:54.454281] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:10.901 crypto0 00:35:10.901 [2024-07-12 18:38:54.462302] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:10.901 crypto1 00:35:10.901 18:38:54 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:10.901 18:38:54 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:11.159 Running I/O for 5 seconds... 00:35:16.425 00:35:16.425 Latency(us) 00:35:16.425 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:16.425 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:16.425 Verification LBA range: start 0x0 length 0x2000 00:35:16.425 crypto1 : 5.02 9117.87 35.62 0.00 0.00 27989.24 6525.11 16868.40 00:35:16.425 =================================================================================================================== 00:35:16.425 Total : 9117.87 35.62 0.00 0.00 27989.24 6525.11 16868.40 00:35:16.425 0 00:35:16.425 18:38:59 chaining -- bdev/chaining.sh@167 -- # killprocess 2673490 00:35:16.425 18:38:59 chaining -- common/autotest_common.sh@948 -- # '[' -z 2673490 ']' 00:35:16.425 18:38:59 chaining -- common/autotest_common.sh@952 -- # kill -0 2673490 00:35:16.425 18:38:59 chaining -- common/autotest_common.sh@953 -- # uname 00:35:16.425 18:38:59 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:16.425 18:38:59 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2673490 00:35:16.425 18:38:59 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:16.425 18:38:59 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:16.426 18:38:59 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2673490' 00:35:16.426 killing process with pid 2673490 00:35:16.426 18:38:59 chaining -- common/autotest_common.sh@967 -- # kill 2673490 00:35:16.426 Received shutdown signal, test time was about 5.000000 seconds 00:35:16.426 00:35:16.426 Latency(us) 00:35:16.426 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:16.426 
=================================================================================================================== 00:35:16.426 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:16.426 18:38:59 chaining -- common/autotest_common.sh@972 -- # wait 2673490 00:35:16.426 18:39:00 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:35:16.426 18:39:00 chaining -- bdev/chaining.sh@170 -- # killprocess 2673490 00:35:16.426 18:39:00 chaining -- common/autotest_common.sh@948 -- # '[' -z 2673490 ']' 00:35:16.426 18:39:00 chaining -- common/autotest_common.sh@952 -- # kill -0 2673490 00:35:16.426 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (2673490) - No such process 00:35:16.426 18:39:00 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 2673490 is not found' 00:35:16.426 Process with pid 2673490 is not found 00:35:16.426 18:39:00 chaining -- bdev/chaining.sh@171 -- # wait 2673490 00:35:16.426 18:39:00 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:16.426 18:39:00 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:16.426 18:39:00 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:16.426 18:39:00 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:16.426 18:39:00 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@336 -- # return 1 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:16.426 WARNING: No supported devices were found, fallback requested for tcp test 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:16.426 18:39:00 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:16.426 18:39:00 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:16.684 Cannot find device "nvmf_tgt_br" 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@155 -- # true 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:16.684 Cannot find device "nvmf_tgt_br2" 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@156 -- # true 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:16.684 Cannot find device "nvmf_tgt_br" 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@158 -- # true 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:16.684 Cannot find device "nvmf_tgt_br2" 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@159 -- # true 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:16.684 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@162 -- # true 00:35:16.684 18:39:00 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:16.684 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@163 -- # true 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:16.684 18:39:00 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:16.942 18:39:00 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:16.942 18:39:00 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:16.942 18:39:00 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:16.942 18:39:00 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:16.942 18:39:00 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:35:16.942 18:39:00 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:16.942 18:39:00 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:17.199 18:39:00 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:17.457 18:39:00 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:17.457 18:39:01 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:17.713 18:39:01 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:17.713 18:39:01 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:17.713 18:39:01 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:17.713 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:17.713 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.130 ms 00:35:17.713 00:35:17.713 --- 10.0.0.2 ping statistics --- 00:35:17.713 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:17.713 rtt min/avg/max/mdev = 0.130/0.130/0.130/0.000 ms 00:35:17.713 18:39:01 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:17.713 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:35:17.713 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.094 ms 00:35:17.713 00:35:17.713 --- 10.0.0.3 ping statistics --- 00:35:17.713 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:17.713 rtt min/avg/max/mdev = 0.094/0.094/0.094/0.000 ms 00:35:17.713 18:39:01 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:17.971 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:35:17.971 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.058 ms 00:35:17.971 00:35:17.971 --- 10.0.0.1 ping statistics --- 00:35:17.971 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:17.971 rtt min/avg/max/mdev = 0.058/0.058/0.058/0.000 ms 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@433 -- # return 0 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:17.971 18:39:01 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:17.971 18:39:01 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:17.971 18:39:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@481 -- # nvmfpid=2674877 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:17.971 18:39:01 chaining -- nvmf/common.sh@482 -- # waitforlisten 2674877 00:35:17.971 18:39:01 chaining -- common/autotest_common.sh@829 -- # '[' -z 2674877 ']' 00:35:17.971 18:39:01 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:17.971 18:39:01 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:17.971 18:39:01 
chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:17.971 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:17.971 18:39:01 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:17.971 18:39:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:17.971 [2024-07-12 18:39:01.621795] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:35:17.971 [2024-07-12 18:39:01.621866] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:18.230 [2024-07-12 18:39:01.749591] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:18.230 [2024-07-12 18:39:01.845079] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:18.230 [2024-07-12 18:39:01.845129] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:18.230 [2024-07-12 18:39:01.845143] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:18.230 [2024-07-12 18:39:01.845156] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:18.230 [2024-07-12 18:39:01.845166] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
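The veth topology built above is verified with one ping per address, and each ping summary ends in an `rtt min/avg/max/mdev` line. A small sketch of pulling the average rtt out of that summary with standard tools — the sample line is copied from the 10.0.0.2 ping earlier in the log:

```shell
# Extract the avg field (second of min/avg/max/mdev) from ping's summary.
summary='rtt min/avg/max/mdev = 0.130/0.130/0.130/0.000 ms'
avg=$(echo "$summary" | cut -d= -f2 | cut -d/ -f2)
echo "$avg"
```

Here min, avg, and max coincide because only a single packet was sent (`ping -c 1`).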
00:35:18.230 [2024-07-12 18:39:01.845195] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:19.162 18:39:02 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:19.162 18:39:02 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:19.162 18:39:02 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:19.162 malloc0 00:35:19.162 [2024-07-12 18:39:02.615377] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:19.162 [2024-07-12 18:39:02.631578] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:19.162 18:39:02 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:35:19.162 18:39:02 chaining -- bdev/chaining.sh@189 -- # bperfpid=2675101 00:35:19.162 18:39:02 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:19.162 18:39:02 chaining -- bdev/chaining.sh@191 -- # waitforlisten 2675101 /var/tmp/bperf.sock 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@829 -- # '[' -z 2675101 ']' 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@834 -- # 
local max_retries=100 00:35:19.162 18:39:02 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:19.163 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:19.163 18:39:02 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:19.163 18:39:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:19.163 [2024-07-12 18:39:02.705149] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 00:35:19.163 [2024-07-12 18:39:02.705217] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2675101 ] 00:35:19.163 [2024-07-12 18:39:02.834646] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:19.419 [2024-07-12 18:39:02.939736] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:19.984 18:39:03 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:19.984 18:39:03 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:19.984 18:39:03 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:35:19.984 18:39:03 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:20.549 [2024-07-12 18:39:03.983854] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:20.549 nvme0n1 00:35:20.549 true 00:35:20.549 crypto0 00:35:20.549 18:39:04 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:20.549 Running I/O for 5 seconds... 
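Unlike the earlier bdevperf runs, this instance was started with `-r /var/tmp/bperf.sock`, so every RPC issued against it (the `rpc_bperf` calls below) must target that private UNIX socket rather than the default /var/tmp/spdk.sock used by the nvmf target. A runnable sketch of the wrapper pattern, with an `echo` stub standing in for the real scripts/rpc.py so the sketch works anywhere:

```shell
# Stub for scripts/rpc.py so this runs without an SPDK tree; the real
# rpc_bperf wrapper (chaining.sh@22) invokes rpc.py with the same -s flag.
RPC_PY='echo rpc.py'

rpc_bperf() {
    # Route the RPC to bdevperf's private UNIX socket.
    $RPC_PY -s /var/tmp/bperf.sock "$@"
}

rpc_bperf accel_get_stats
```

Keeping the two sockets separate lets the test drive the nvmf target and the bdevperf initiator independently in the same run.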
00:35:25.814 00:35:25.814 Latency(us) 00:35:25.814 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:25.814 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:25.814 Verification LBA range: start 0x0 length 0x2000 00:35:25.814 crypto0 : 5.02 8312.41 32.47 0.00 0.00 30696.42 4986.43 27696.08 00:35:25.814 =================================================================================================================== 00:35:25.814 Total : 8312.41 32.47 0.00 0.00 30696.42 4986.43 27696.08 00:35:25.814 0 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@205 -- # sequence=83464 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:25.814 18:39:09 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:25.814 18:39:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@206 -- # encrypt=41732 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:26.071 18:39:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@207 -- # decrypt=41732 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:26.328 18:39:09 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:26.328 18:39:09 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:26.586 18:39:10 chaining -- bdev/chaining.sh@208 -- # crc32c=83464 00:35:26.586 18:39:10 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:35:26.586 18:39:10 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:35:26.586 18:39:10 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:35:26.586 18:39:10 chaining -- bdev/chaining.sh@214 -- # killprocess 2675101 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@948 -- # '[' -z 2675101 ']' 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@952 -- # kill -0 2675101 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@953 -- # uname 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2675101 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2675101' 00:35:26.586 killing process with pid 2675101 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@967 -- # kill 2675101 00:35:26.586 Received shutdown signal, test time was about 5.000000 seconds 00:35:26.586 00:35:26.586 Latency(us) 00:35:26.586 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:26.586 
=================================================================================================================== 00:35:26.586 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:26.586 18:39:10 chaining -- common/autotest_common.sh@972 -- # wait 2675101 00:35:26.843 18:39:10 chaining -- bdev/chaining.sh@219 -- # bperfpid=2676314 00:35:26.843 18:39:10 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:35:26.843 18:39:10 chaining -- bdev/chaining.sh@221 -- # waitforlisten 2676314 /var/tmp/bperf.sock 00:35:26.843 18:39:10 chaining -- common/autotest_common.sh@829 -- # '[' -z 2676314 ']' 00:35:26.843 18:39:10 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:26.843 18:39:10 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:26.843 18:39:10 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:26.843 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:26.843 18:39:10 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:26.843 18:39:10 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:27.099 [2024-07-12 18:39:10.586870] Starting SPDK v24.09-pre git sha1 182dd7de4 / DPDK 24.03.0 initialization... 
00:35:27.100 [2024-07-12 18:39:10.586954] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2676314 ] 00:35:27.100 [2024-07-12 18:39:10.718412] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:27.100 [2024-07-12 18:39:10.814807] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:28.031 18:39:11 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:28.031 18:39:11 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:28.031 18:39:11 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:35:28.031 18:39:11 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:28.289 [2024-07-12 18:39:11.927157] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:28.289 nvme0n1 00:35:28.289 true 00:35:28.289 crypto0 00:35:28.289 18:39:11 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:28.547 Running I/O for 5 seconds... 
00:35:33.948 00:35:33.948 Latency(us) 00:35:33.948 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:33.948 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:35:33.948 Verification LBA range: start 0x0 length 0x200 00:35:33.948 crypto0 : 5.01 1697.16 106.07 0.00 0.00 18476.91 1332.09 18919.96 00:35:33.948 =================================================================================================================== 00:35:33.948 Total : 1697.16 106.07 0.00 0.00 18476.91 1332.09 18919.96 00:35:33.948 0 00:35:33.948 18:39:17 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:35:33.948 18:39:17 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:33.948 18:39:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.948 18:39:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:33.948 18:39:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:33.948 18:39:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:33.948 18:39:17 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@233 -- # sequence=16994 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:33.949 18:39:17 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@234 -- # encrypt=8497 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:33.949 18:39:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@235 -- # decrypt=8497 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:34.206 18:39:17 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:34.206 18:39:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:34.463 18:39:17 chaining -- bdev/chaining.sh@236 -- # crc32c=16994 00:35:34.463 18:39:17 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:35:34.463 18:39:17 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:35:34.463 18:39:17 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:35:34.463 18:39:17 chaining -- bdev/chaining.sh@242 -- # killprocess 2676314 00:35:34.463 18:39:17 chaining -- common/autotest_common.sh@948 -- # '[' -z 2676314 ']' 00:35:34.463 18:39:17 chaining -- common/autotest_common.sh@952 -- # kill -0 2676314 00:35:34.463 18:39:17 chaining -- common/autotest_common.sh@953 -- # uname 00:35:34.463 18:39:17 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:34.463 18:39:17 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2676314 00:35:34.463 18:39:18 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:34.463 18:39:18 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:34.463 18:39:18 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2676314' 00:35:34.463 killing process with pid 2676314 00:35:34.463 18:39:18 chaining -- common/autotest_common.sh@967 -- # kill 2676314 00:35:34.463 Received shutdown signal, test time was about 5.000000 seconds 00:35:34.463 00:35:34.463 Latency(us) 00:35:34.463 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:34.463 
=================================================================================================================== 00:35:34.463 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:34.463 18:39:18 chaining -- common/autotest_common.sh@972 -- # wait 2676314 00:35:34.720 18:39:18 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:35:34.720 18:39:18 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:34.720 18:39:18 chaining -- nvmf/common.sh@117 -- # sync 00:35:34.720 18:39:18 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:34.720 18:39:18 chaining -- nvmf/common.sh@120 -- # set +e 00:35:34.720 18:39:18 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:34.720 18:39:18 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:34.720 rmmod nvme_tcp 00:35:34.720 rmmod nvme_fabrics 00:35:34.720 rmmod nvme_keyring 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@124 -- # set -e 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@125 -- # return 0 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@489 -- # '[' -n 2674877 ']' 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@490 -- # killprocess 2674877 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@948 -- # '[' -z 2674877 ']' 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@952 -- # kill -0 2674877 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@953 -- # uname 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2674877 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2674877' 00:35:37.253 killing process with pid 
2674877 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@967 -- # kill 2674877 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@972 -- # wait 2674877 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:37.253 18:39:20 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:37.253 18:39:20 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:35:37.253 00:35:37.253 real 0m49.449s 00:35:37.253 user 1m3.116s 00:35:37.253 sys 0m13.360s 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:37.253 18:39:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:37.253 ************************************ 00:35:37.253 END TEST chaining 00:35:37.253 ************************************ 00:35:37.253 18:39:20 -- common/autotest_common.sh@1142 -- # return 0 00:35:37.253 18:39:20 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:35:37.253 18:39:20 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:35:37.253 18:39:20 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:35:37.253 18:39:20 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:35:37.253 18:39:20 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:35:37.253 18:39:20 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:35:37.253 18:39:20 -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:37.253 
18:39:20 -- common/autotest_common.sh@10 -- # set +x 00:35:37.253 18:39:20 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:35:37.253 18:39:20 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:35:37.253 18:39:20 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:35:37.253 18:39:20 -- common/autotest_common.sh@10 -- # set +x 00:35:41.439 INFO: APP EXITING 00:35:41.439 INFO: killing all VMs 00:35:41.439 INFO: killing vhost app 00:35:41.439 INFO: EXIT DONE 00:35:45.618 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:35:45.618 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:35:45.618 Waiting for block devices as requested 00:35:45.618 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:35:45.876 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:35:45.876 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:35:46.134 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:35:46.134 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:35:46.134 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:35:46.392 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:35:46.392 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:35:46.392 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:35:46.650 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:35:46.650 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:35:46.650 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:35:46.907 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:35:46.907 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:35:46.907 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:35:47.165 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:35:47.165 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:35:51.345 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:35:51.345 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:35:51.345 Cleaning 00:35:51.345 Removing: /var/run/dpdk/spdk0/config 00:35:51.345 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:35:51.345 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:35:51.345 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:35:51.345 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:35:51.345 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:35:51.345 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:35:51.345 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:35:51.345 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:35:51.345 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:35:51.345 Removing: /var/run/dpdk/spdk0/hugepage_info 00:35:51.345 Removing: /dev/shm/nvmf_trace.0 00:35:51.345 Removing: /dev/shm/spdk_tgt_trace.pid2416605 00:35:51.345 Removing: /var/run/dpdk/spdk0 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2415750 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2416605 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2417131 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2417863 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2418051 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2418866 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2418992 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2419274 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2421888 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2423234 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2423465 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2423848 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2424119 00:35:51.345 Removing: /var/run/dpdk/spdk_pid2424401 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2424650 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2424885 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2425140 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2425723 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2428412 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2428617 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2428851 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2429070 00:35:51.602 Removing: 
/var/run/dpdk/spdk_pid2429220 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2429321 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2429529 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2429819 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2430078 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2430279 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2430490 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2430687 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2430894 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2431243 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2431439 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2431640 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2431835 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2432038 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2432307 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2432588 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2432790 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2432986 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2433186 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2433413 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2433729 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2433939 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2434144 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2434583 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2435002 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2435529 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2435957 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2436328 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2436660 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2436902 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2437131 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2437407 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2437860 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2438235 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2438427 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2442585 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2444288 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2445978 
00:35:51.602 Removing: /var/run/dpdk/spdk_pid2446875 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2447948 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2448252 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2448335 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2448365 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2452268 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2452703 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2453751 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2453956 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2459278 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2460911 00:35:51.602 Removing: /var/run/dpdk/spdk_pid2462557 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2466644 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2468269 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2469078 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2473157 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2475619 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2476430 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2486168 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2488399 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2490002 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2499945 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2502183 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2503158 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2513038 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2516388 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2517793 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2529047 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2531482 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2532633 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2543641 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2546489 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2547638 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2558879 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2562761 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2563890 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2565026 00:35:51.859 Removing: 
/var/run/dpdk/spdk_pid2568263 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2573969 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2576499 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2581072 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2584387 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2589745 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2592510 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2599321 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2601681 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2607806 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2610209 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2616250 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2618623 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2623069 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2623422 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2624151 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2624511 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2624946 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2625715 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2626519 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2626854 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2628601 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2630207 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2631811 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2633190 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2634722 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2636365 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2638068 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2639372 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2639916 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2640443 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2642459 00:35:51.859 Removing: /var/run/dpdk/spdk_pid2644315 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2646161 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2647287 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2648953 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2650001 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2650035 
00:35:52.117 Removing: /var/run/dpdk/spdk_pid2650170 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2650468 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2650659 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2652098 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2653766 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2655267 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2656009 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2656863 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2657065 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2657169 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2657273 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2658207 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2658752 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2659132 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2661289 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2663138 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2664984 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2666050 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2667133 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2667721 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2667848 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2671753 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2671868 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2672000 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2672192 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2672399 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2672612 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2673490 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2675101 00:35:52.117 Removing: /var/run/dpdk/spdk_pid2676314 00:35:52.117 Clean 00:35:53.489 18:39:36 -- common/autotest_common.sh@1451 -- # return 0 00:35:53.489 18:39:36 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:35:53.489 18:39:36 -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:53.489 18:39:36 -- common/autotest_common.sh@10 -- # set +x 00:35:53.489 18:39:36 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:35:53.489 18:39:36 -- 
common/autotest_common.sh@728 -- # xtrace_disable 00:35:53.489 18:39:36 -- common/autotest_common.sh@10 -- # set +x 00:35:53.489 18:39:36 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:35:53.489 18:39:36 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:35:53.489 18:39:36 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:35:53.489 18:39:36 -- spdk/autotest.sh@391 -- # hash lcov 00:35:53.489 18:39:36 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:35:53.489 18:39:36 -- spdk/autotest.sh@393 -- # hostname 00:35:53.489 18:39:36 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:35:53.489 geninfo: WARNING: invalid characters removed from testname! 
00:36:25.572 18:40:04 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:25.572 18:40:07 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:26.950 18:40:10 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:29.484 18:40:13 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:32.015 18:40:15 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:35.322 18:40:18 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:37.327 18:40:20 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:36:37.327 18:40:21 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:36:37.327 18:40:21 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:36:37.327 18:40:21 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:36:37.327 18:40:21 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:36:37.327 18:40:21 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:37.327 18:40:21 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:37.327 18:40:21 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:37.327 18:40:21 -- paths/export.sh@5 -- $ export PATH
00:36:37.327 18:40:21 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:37.327 18:40:21 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:36:37.327 18:40:21 -- common/autobuild_common.sh@444 -- $ date +%s
00:36:37.327 18:40:21 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720802421.XXXXXX
00:36:37.327 18:40:21 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720802421.C7djVl
00:36:37.327 18:40:21 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
00:36:37.327 18:40:21 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
00:36:37.327 18:40:21 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:36:37.327 18:40:21 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:36:37.327 18:40:21 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:36:37.327 18:40:21 -- common/autobuild_common.sh@460 -- $ get_config_params
00:36:37.327 18:40:21 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:36:37.327 18:40:21 -- common/autotest_common.sh@10 -- $ set +x
00:36:37.585 18:40:21 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:36:37.585 18:40:21 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:36:37.585 18:40:21 -- pm/common@17 -- $ local monitor
00:36:37.585 18:40:21 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:37.585 18:40:21 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:37.585 18:40:21 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:37.585 18:40:21 -- pm/common@21 -- $ date +%s
00:36:37.585 18:40:21 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:37.585 18:40:21 -- pm/common@21 -- $ date +%s
00:36:37.585 18:40:21 -- pm/common@25 -- $ sleep 1
00:36:37.585 18:40:21 -- pm/common@21 -- $ date +%s
00:36:37.585 18:40:21 -- pm/common@21 -- $ date +%s
00:36:37.585 18:40:21 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720802421
00:36:37.585 18:40:21 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720802421
00:36:37.585 18:40:21 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720802421
00:36:37.585 18:40:21 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720802421
00:36:37.585 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720802421_collect-vmstat.pm.log
00:36:37.585 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720802421_collect-cpu-load.pm.log
00:36:37.585 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720802421_collect-cpu-temp.pm.log
00:36:37.585 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720802421_collect-bmc-pm.bmc.pm.log
00:36:38.521 18:40:22 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:36:38.521 18:40:22 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:36:38.521 18:40:22 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:38.521 18:40:22 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:36:38.521 18:40:22 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:36:38.521 18:40:22 -- spdk/autopackage.sh@19 -- $ timing_finish
00:36:38.522 18:40:22 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:36:38.522 18:40:22 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:36:38.522 18:40:22 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:36:38.522 18:40:22 -- spdk/autopackage.sh@20 -- $ exit 0
00:36:38.522 18:40:22 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:36:38.522 18:40:22 -- pm/common@29 -- $ signal_monitor_resources TERM
00:36:38.522 18:40:22 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:36:38.522 18:40:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:38.522 18:40:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:36:38.522 18:40:22 -- pm/common@44 -- $ pid=2687567
00:36:38.522 18:40:22 -- pm/common@50 -- $ kill -TERM 2687567
00:36:38.522 18:40:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:38.522 18:40:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:36:38.522 18:40:22 -- pm/common@44 -- $ pid=2687569
00:36:38.522 18:40:22 -- pm/common@50 -- $ kill -TERM 2687569
00:36:38.522 18:40:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:38.522 18:40:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:36:38.522 18:40:22 -- pm/common@44 -- $ pid=2687570
00:36:38.522 18:40:22 -- pm/common@50 -- $ kill -TERM 2687570
00:36:38.522 18:40:22 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:38.522 18:40:22 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:36:38.522 18:40:22 -- pm/common@44 -- $ pid=2687594
00:36:38.522 18:40:22 -- pm/common@50 -- $ sudo -E kill -TERM 2687594
00:36:38.522 + [[ -n 2300741 ]]
00:36:38.522 + sudo kill 2300741
00:36:38.532 [Pipeline] }
00:36:38.553 [Pipeline] // stage
00:36:38.560 [Pipeline] }
00:36:38.580 [Pipeline] // timeout
00:36:38.586 [Pipeline] }
00:36:38.604 [Pipeline] // catchError
00:36:38.609 [Pipeline] }
00:36:38.630 [Pipeline] // wrap
00:36:38.638 [Pipeline] }
00:36:38.656 [Pipeline] // catchError
00:36:38.667 [Pipeline] stage
00:36:38.670 [Pipeline] { (Epilogue)
00:36:38.686 [Pipeline] catchError
00:36:38.688 [Pipeline] {
00:36:38.705 [Pipeline] echo
00:36:38.706 Cleanup processes
00:36:38.713 [Pipeline] sh
00:36:38.996 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:38.996 2687668 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:36:38.996 2687887 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:39.011 [Pipeline] sh
00:36:39.293 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:39.293 ++ grep -v 'sudo pgrep'
00:36:39.293 ++ awk '{print $1}'
00:36:39.293 + sudo kill -9 2687668
00:36:39.319 [Pipeline] sh
00:36:39.609 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:36:51.818 [Pipeline] sh
00:36:52.101 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:36:52.101 Artifacts sizes are good
00:36:52.122 [Pipeline] archiveArtifacts
00:36:52.132 Archiving artifacts
00:36:52.257 [Pipeline] sh
00:36:52.539 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:36:52.560 [Pipeline] cleanWs
00:36:52.572 [WS-CLEANUP] Deleting project workspace...
00:36:52.572 [WS-CLEANUP] Deferred wipeout is used...
00:36:52.578 [WS-CLEANUP] done
00:36:52.584 [Pipeline] }
00:36:52.606 [Pipeline] // catchError
00:36:52.620 [Pipeline] sh
00:36:52.903 + logger -p user.info -t JENKINS-CI
00:36:52.913 [Pipeline] }
00:36:52.933 [Pipeline] // stage
00:36:52.940 [Pipeline] }
00:36:52.958 [Pipeline] // node
00:36:52.991 [Pipeline] End of Pipeline
00:36:53.027 Finished: SUCCESS